oracletospark.io

Convert Oracle SQL to Databricks SQL

The free Oracle → Databricks SQL converter. Handles ROWNUM, CONNECT BY, NVL, DECODE, SYSDATE, sequences, and 20+ Oracle-specific functions. No signup.

✓ No signup ✓ No data stored ✓ Instant
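Under the hood, a converter like this comes down to systematic rewrites. One of the trickier rewrites is date format masks: Oracle's TO_DATE/TO_CHAR tokens differ from Spark's Java-style datetime patterns (Oracle YYYY → Spark yyyy, MI → mm, HH24 → HH). Here is a minimal Python sketch of such a mask translator — the mapping table and function name are illustrative, not this tool's actual code, and the token list is deliberately not exhaustive:

```python
# Hypothetical helper: translate common Oracle date format masks to
# Spark's Java SimpleDateFormat-style patterns. Illustrative only.
# Ordered list, not a dict: longer tokens (HH24) must be replaced
# before their shorter lookalikes could match.
ORACLE_TO_SPARK_MASKS = [
    ("HH24", "HH"),   # 24-hour clock
    ("HH12", "hh"),   # 12-hour clock
    ("YYYY", "yyyy"), # 4-digit year
    ("MON",  "MMM"),  # abbreviated month name
    ("MI",   "mm"),   # minutes (Spark's mm; Oracle MM stays MM = month)
    ("DD",   "dd"),   # day of month
    ("SS",   "ss"),   # seconds
]

def convert_format_mask(mask: str) -> str:
    """Translate an Oracle TO_DATE/TO_CHAR format mask to a Spark pattern."""
    out = mask
    for oracle_tok, spark_tok in ORACLE_TO_SPARK_MASKS:
        out = out.replace(oracle_tok, spark_tok)
    return out

# convert_format_mask("YYYY-MM-DD HH24:MI:SS") -> "yyyy-MM-dd HH:mm:ss"
```

Note that ordering matters: HH24 must be rewritten before any rule that could touch its characters, and month (MM) is left alone because it happens to be the same token in both dialects.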

SQL Conversion Reference

Complete reference for Oracle → Spark SQL conversions. Bookmark for quick lookup during your migration.

| Oracle | Spark SQL / Databricks | Notes |
| --- | --- | --- |
| SYSDATE | CURRENT_TIMESTAMP | Direct replacement |
| SYSTIMESTAMP | CURRENT_TIMESTAMP | Direct replacement |
| NVL(x, y) | COALESCE(x, y) | Identical behavior |
| NVL2(x, y, z) | IF(x IS NOT NULL, y, z) | Returns y when x is NOT NULL, otherwise z |
| DECODE(col, v1, r1, …, def) | CASE WHEN col = v1 THEN r1 … ELSE def END | A trailing odd argument becomes the ELSE default |
| TO_DATE(str, fmt) | TO_DATE(str, fmt) | Format masks differ: Oracle YYYY → Spark yyyy, DD → dd |
| TO_CHAR(date, fmt) | DATE_FORMAT(date, fmt) | Format strings differ significantly; verify each format mask |
| TO_NUMBER(x) | CAST(x AS DECIMAL) | Specify precision/scale if known |
| TO_CLOB(x) | CAST(x AS STRING) | Spark STRING is unbounded |
| VARCHAR2(n) | STRING | No length limit needed in Spark |
| NUMBER(p, s) | DECIMAL(p, s) | With precision and scale |
| NUMBER | BIGINT | Without precision; use DECIMAL for fractional values |
| CLOB | STRING | Spark STRING has no size limit |
| BLOB | BINARY | Binary data type |
| DATE (type) | TIMESTAMP | Oracle DATE stores a time component; Spark DATE is date-only. Use TIMESTAMP to preserve time. |
| FROM DUAL | (omit) | Spark allows SELECT without a FROM clause; FROM (SELECT 1) also works |
| MINUS | EXCEPT | Set-difference operator |
| x \|\| y | CONCAT(x, y) | Spark also supports \|\| natively |
| SUBSTR(x, y, z) | SUBSTRING(x, y, z) | Same argument order; Spark also accepts SUBSTR |
| INSTR(str, sub) | LOCATE(sub, str) | Argument order is reversed! Spark's own INSTR(str, sub) keeps Oracle's order |
| LENGTHB(x) | LENGTH(x) | Oracle counts bytes; Spark LENGTH counts characters. Use OCTET_LENGTH for a byte count. |
| TRUNC(date) | DATE_TRUNC('day', date) | Truncate to the day boundary; result is a TIMESTAMP |
| TRUNC(number) | CAST(number AS BIGINT) | Spark has no numeric TRUNC; casting truncates toward zero |
| MOD(x, y) | x % y | Modulo operator; Spark also supports MOD(x, y) |
| CEIL(x) | CEILING(x) | Spark accepts both CEIL and CEILING |
| LPAD / RPAD | LPAD / RPAD | Identical in Spark |
| WHERE ROWNUM <= N | LIMIT N | Moves to the end of the query |
| ROWID | monotonically_increasing_id() | No direct equivalent; consider ROW_NUMBER() OVER (ORDER BY …) |
| seq.NEXTVAL | — (manual) | Use Delta identity columns (GENERATED ALWAYS AS IDENTITY) or monotonically_increasing_id() |
| seq.CURRVAL | — (not supported) | No equivalent; requires redesign |
| CONNECT BY PRIOR | WITH RECURSIVE cte AS (…) | Hierarchical queries must be rewritten as recursive CTEs; requires a runtime that supports them |
| table.col = other.col(+) | LEFT JOIN | Oracle (+) outer-join notation → ANSI JOIN syntax |
| MERGE INTO … USING | MERGE INTO … USING | Delta Lake supports MERGE; verify WHEN MATCHED / WHEN NOT MATCHED clauses |
| CREATE OR REPLACE | CREATE OR REPLACE | Supported for Delta tables and views |
| DBMS_OUTPUT.PUT_LINE | print() / display() | Use Python/Scala notebook output functions |
| EXCEPTION / WHEN OTHERS | try/except (Python) | PL/SQL exception blocks become error handling in Python/Scala wrapper code |
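To see how the DECODE rule in the table plays out, here is a minimal Python sketch that expands a DECODE argument list into a CASE expression. The helper name and the "list of SQL text fragments" argument convention are assumptions for illustration, not the converter's real internals:

```python
def decode_to_case(args):
    """Expand Oracle DECODE arguments into a Spark CASE expression.

    args: [col, v1, r1, v2, r2, ..., default?] as strings of SQL text.
    A trailing odd argument (after the column) becomes the ELSE default.
    """
    col, rest = args[0], args[1:]
    # Pair up (value, result); zip truncates, so a trailing default is ignored here.
    pairs = list(zip(rest[::2], rest[1::2]))
    # Odd number of remaining args means the last one is the default.
    default = rest[-1] if len(rest) % 2 == 1 else None
    clauses = [f"WHEN {col} = {v} THEN {r}" for v, r in pairs]
    case = "CASE " + " ".join(clauses)
    if default is not None:
        case += f" ELSE {default}"
    return case + " END"

# decode_to_case(["status", "'A'", "'Active'", "'Unknown'"])
# -> "CASE WHEN status = 'A' THEN 'Active' ELSE 'Unknown' END"
```

One semantic gap to keep in mind: Oracle's DECODE treats two NULLs as equal, while the generated CASE compares with =, which never matches NULL. A NULL match value needs an explicit IS NULL branch in the rewritten query.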

Need Help With the Full Migration?

Oracle-to-Databricks migrations typically take 3–6 months with traditional consultants. Our AI-powered approach cuts that to 3–6 weeks.

No commitment · Usually respond within 24 hours
