

ORA-01461: Can Bind a LONG Value Only for Insert into a LONG Column


Error Text: ORA-01461: can bind a LONG value only for insert into a LONG column

The ORA-01461 error occurs when an application attempts to bind a character string that exceeds 4000 bytes (or 32767 bytes in PL/SQL) into a column that is not defined as a LONG or LOB type. This error frequently surfaces in JDBC applications, ORM frameworks like Hibernate, or PL/SQL procedures that handle large text values without proper LOB handling. It can also appear when the NLS character set causes a multi-byte string to exceed column size limits.

Common causes include:

  • A bound variable contains more than 4000 bytes for a VARCHAR2 column
  • Multi-byte NLS character sets cause apparently short strings to exceed byte limits
  • Application layer sending oversized text without truncation or validation
  • Old or mismatched JDBC thin driver versions misclassify large strings as LONG
  • Driver automatically promotes strings over a threshold to LONG type
  • ojdbc version incompatible with the database server version
  • Attempting to use a LONG value in a WHERE clause or expression
  • Binding LONG data into UPDATE or SELECT operations (only INSERT allowed)
  • PL/SQL variables of type LONG used in unsupported contexts
  • Hibernate or JPA mapping VARCHAR to a column storing large payloads
  • Entity field not annotated with @Lob when storing large text
  • ActiveRecord or similar frameworks not converting to CLOB/BLOB automatically
  • Database character set is AL32UTF8 and strings contain 3- or 4-byte characters
  • A string of 1500 characters becomes 4500+ bytes after encoding
  • Application passes character count rather than byte count to Oracle
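The last group of causes is easy to reproduce outside the database. A minimal Java sketch (plain JDK, no Oracle driver required; the sample character is an arbitrary 3-byte CJK code point) showing how 1500 characters become 4500 bytes in UTF-8:

```java
import java.nio.charset.StandardCharsets;

public class ByteExpansionDemo {
    public static void main(String[] args) {
        // U+4F60 is one Java char but encodes to 3 bytes in UTF-8 (AL32UTF8)
        String text = "\u4F60".repeat(1500);
        int chars = text.length();
        int bytes = text.getBytes(StandardCharsets.UTF_8).length;
        // 4500 bytes exceeds the 4000-byte VARCHAR2 bind limit -> ORA-01461
        System.out.println(chars + " characters -> " + bytes + " bytes");
    }
}
```

This is why validating on character count alone is not enough: the byte count is what Oracle checks at bind time.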
-- Find LONG columns in the schema
SELECT owner, table_name, column_name, data_type, data_length
FROM dba_tab_columns
WHERE data_type = 'LONG'
AND owner NOT IN ('SYS', 'SYSTEM', 'SYSMAN', 'OUTLN')
ORDER BY owner, table_name, column_name;
-- Find columns that may store large text (VARCHAR2 near limit)
SELECT owner, table_name, column_name, data_type, char_length, data_length
FROM dba_tab_columns
WHERE data_type = 'VARCHAR2'
AND char_length > 3000
AND owner NOT IN ('SYS', 'SYSTEM', 'SYSMAN', 'OUTLN')
ORDER BY char_length DESC;
-- Verify database and national character sets
SELECT parameter, value
FROM nls_database_parameters
WHERE parameter IN ('NLS_CHARACTERSET', 'NLS_NCHAR_CHARACTERSET');
-- Check session NLS settings
SELECT parameter, value
FROM nls_session_parameters
-- NLS_CHARACTERSET is database-wide and does not appear in this view
WHERE parameter = 'NLS_LENGTH_SEMANTICS';
-- Test byte length vs character length (use a multi-byte string;
-- a pure ASCII string shows no difference)
SELECT
LENGTHB(UNISTR('Gr\00FC\00DFe')) AS byte_length, -- 7 bytes in AL32UTF8
LENGTH(UNISTR('Gr\00FC\00DFe')) AS char_length   -- 5 characters
FROM dual;
-- Review recent session activity around the failure time
-- (ASH does not record ORA- errors; correlate timestamps with application logs)
SELECT
sample_time,
session_id,
sql_id,
program,
module,
action
FROM dba_hist_active_sess_history
WHERE sample_time > SYSDATE - 1
AND program IS NOT NULL
ORDER BY sample_time DESC
FETCH FIRST 20 ROWS ONLY;
-- Find recently active INSERT statements to correlate with the failing bind
SELECT sql_id, sql_text, executions, last_active_time
FROM v$sql
WHERE sql_text LIKE '%INSERT%'
AND last_active_time > SYSDATE - 1/24
ORDER BY last_active_time DESC;

The recommended fix for any LONG column is to migrate it to CLOB, which supports full DML operations without these restrictions.

-- Option 1 (simplest, Oracle 9i and later): convert the LONG column in place
ALTER TABLE my_table MODIFY (old_long_column CLOB);
-- Option 2: rebuild through a copy. Note that TO_LOB is valid only in the
-- SELECT list of CREATE TABLE AS SELECT or INSERT ... SELECT, not in an UPDATE.
CREATE TABLE my_table_new AS
SELECT id, TO_LOB(old_long_column) AS old_long_column -- list your other columns too
FROM my_table;
-- Then drop my_table, rename my_table_new, and recreate indexes and grants
-- Verify migration
SELECT column_name, data_type
FROM user_tab_columns
WHERE table_name = 'MY_TABLE'
AND column_name = 'OLD_LONG_COLUMN';

When the column holds user-generated content that can grow beyond 4000 bytes:

-- Convert a VARCHAR2 column to CLOB
ALTER TABLE my_table ADD temp_clob CLOB;
UPDATE my_table SET temp_clob = description;
COMMIT;
ALTER TABLE my_table DROP COLUMN description;
ALTER TABLE my_table RENAME COLUMN temp_clob TO description;
-- Confirm data type
SELECT column_name, data_type, data_length
FROM user_tab_columns
WHERE table_name = 'MY_TABLE';
-- BAD: using VARCHAR2 for large content
CREATE OR REPLACE PROCEDURE save_content(p_text VARCHAR2) AS
BEGIN
INSERT INTO content_table(body) VALUES (p_text); -- Fails (ORA-01461/ORA-12899) when p_text exceeds the column's byte limit
COMMIT;
END;
/
-- GOOD: use CLOB parameter
CREATE OR REPLACE PROCEDURE save_content(p_text CLOB) AS
BEGIN
INSERT INTO content_table(body) VALUES (p_text);
COMMIT;
END;
/
-- GOOD: handle large in-memory strings with CLOB
DECLARE
v_content CLOB;
large_string VARCHAR2(32767) := RPAD('x', 32000, 'x'); -- example payload
BEGIN
DBMS_LOB.CREATETEMPORARY(v_content, TRUE);
DBMS_LOB.WRITEAPPEND(v_content, LENGTH(large_string), large_string);
INSERT INTO content_table(body) VALUES (v_content);
COMMIT;
DBMS_LOB.FREETEMPORARY(v_content);
END;
/

Upgrade to a supported ojdbc version and configure the connection correctly:

// Use setClob instead of setString for large text
PreparedStatement pstmt = conn.prepareStatement(
"INSERT INTO content_table(body) VALUES (?)"
);
// For large strings, use setCharacterStream or setClob
java.io.Reader reader = new java.io.StringReader(largeText);
pstmt.setCharacterStream(1, reader, largeText.length());
pstmt.executeUpdate();
-- Add a check constraint to enforce column limits
ALTER TABLE my_table
ADD CONSTRAINT chk_content_length CHECK (LENGTHB(short_text) <= 4000);
-- PL/SQL validation before insert
CREATE OR REPLACE PROCEDURE safe_insert(p_text VARCHAR2) AS
BEGIN
IF LENGTHB(p_text) > 4000 THEN
RAISE_APPLICATION_ERROR(-20001,
'Text exceeds 4000 bytes: ' || LENGTHB(p_text) || ' bytes provided.');
END IF;
INSERT INTO my_table(short_text) VALUES (p_text);
COMMIT;
END;
/
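The same validation can run in the application tier before the bind ever reaches Oracle. A hedged Java sketch (pure JDK; the class and method names are illustrative, and 4000 assumes standard, non-extended VARCHAR2 limits) that checks the UTF-8 byte length and truncates without splitting a multi-byte character:

```java
import java.nio.charset.StandardCharsets;

public class ByteLimitGuard {
    static final int MAX_VARCHAR2_BYTES = 4000; // standard (non-extended) limit

    // True when s fits the byte budget once encoded as UTF-8
    static boolean fits(String s, int maxBytes) {
        return s.getBytes(StandardCharsets.UTF_8).length <= maxBytes;
    }

    // Truncate s so its UTF-8 encoding is at most maxBytes,
    // backing up so a multi-byte character is never split
    static String truncateToBytes(String s, int maxBytes) {
        byte[] bytes = s.getBytes(StandardCharsets.UTF_8);
        if (bytes.length <= maxBytes) {
            return s;
        }
        int end = maxBytes;
        // UTF-8 continuation bytes match the bit pattern 10xxxxxx
        while (end > 0 && (bytes[end] & 0xC0) == 0x80) {
            end--;
        }
        return new String(bytes, 0, end, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        String s = "\u00E9".repeat(2500); // 2500 chars, 5000 bytes in UTF-8
        System.out.println(fits(s, MAX_VARCHAR2_BYTES));                     // false
        System.out.println(truncateToBytes(s, MAX_VARCHAR2_BYTES).length()); // 2000
    }
}
```

Rejecting (as `safe_insert` does) is usually safer than silently truncating; the truncation helper is for cases where losing tail content is acceptable.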
-- Default new columns to BYTE length semantics
-- (affects subsequent DDL only, not existing columns or bind limits)
ALTER SYSTEM SET nls_length_semantics = BYTE SCOPE=BOTH;
-- Or per session
ALTER SESSION SET nls_length_semantics = BYTE;
-- Create columns with explicit BYTE or CHAR semantics
CREATE TABLE content_table (
id NUMBER GENERATED ALWAYS AS IDENTITY,
short_text VARCHAR2(4000 BYTE), -- Explicit byte limit
long_text CLOB, -- For unbounded text
CONSTRAINT pk_content PRIMARY KEY (id)
);
-- Regular audit to find remaining LONG columns
SELECT
owner,
table_name,
column_name,
'ALTER TABLE ' || owner || '.' || table_name ||
' MODIFY ' || column_name || ' CLOB;' AS suggested_fix
FROM dba_tab_columns
WHERE data_type = 'LONG'
AND owner NOT IN ('SYS', 'SYSTEM', 'OUTLN', 'MDSYS')
ORDER BY owner, table_name;
  • Annotate Java/Hibernate entity fields storing large content with @Lob @Column(columnDefinition = "CLOB")
  • Use @Size or custom validators to enforce byte limits before persistence
  • Enable SQL logging during development to catch oversized bind parameters early
  • Always use setClob() or setCharacterStream() for fields mapped to CLOB columns
  • Keep ojdbc driver version aligned with the database server version
  • Set oracle.jdbc.defaultLobPrefetchSize in connection properties to improve LOB performance
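As a sketch of the last point, JDBC connection properties are plain key/value pairs. The account name and the prefetch value of 100000 are illustrative, and actually opening the connection requires the ojdbc driver on the classpath:

```java
import java.util.Properties;

public class OracleConnProps {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.setProperty("user", "app_user"); // hypothetical account
        // Prefetch LOB data with the row to cut round trips (value is illustrative)
        props.setProperty("oracle.jdbc.defaultLobPrefetchSize", "100000");
        // DriverManager.getConnection(jdbcUrl, props) would consume these;
        // the call is omitted so the sketch runs without the Oracle driver
        System.out.println(props.getProperty("oracle.jdbc.defaultLobPrefetchSize"));
    }
}
```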
Related errors:

  • ORA-01401 - Inserted value too large for column
  • ORA-12899 - Value too large for column
  • ORA-00932 - Inconsistent datatypes in expression
  • ORA-01422 - Exact fetch returns more than one row