I'm trying to write a file for every cursor execution, as shown in the PL/SQL below. The data_payload column will have more than 1000 characters.
There is an exception at the utl_file.put line. If I use SUBSTR to take only 1000 characters, then the file writes successfully. Is there any alternate function to handle this?
SET SERVEROUTPUT ON
DECLARE
    l_file_name   VARCHAR2(100);
    l_chr_payload VARCHAR2(4000);
    fhandle       utl_file.file_type;

    CURSOR payload_cur IS
        SELECT data_payload,
               request_reference
          FROM temp_tbl
         WHERE message_type = 'ORDERCREATE'
           AND ROWNUM < 2;

    TYPE payload_typ IS
        TABLE OF payload_cur%rowtype INDEX BY BINARY_INTEGER;
    l_col_payload payload_typ;
BEGIN
    OPEN payload_cur;
    l_col_payload.DELETE;
    FETCH payload_cur BULK COLLECT INTO l_col_payload;
    CLOSE payload_cur;

    FOR i IN 1 .. l_col_payload.COUNT LOOP
        l_file_name := l_col_payload(i).request_reference
                       || '_'
                       || i
                       || '.json';
        dbms_output.put_line('l_file_name' || l_file_name);

        fhandle := utl_file.fopen('TMP_DIR',     -- file location (directory object)
                                  l_file_name,   -- file name
                                  'w');          -- open mode: w = write

        l_chr_payload := substr(l_col_payload(i).data_payload, 1, 1000);

        utl_file.put(fhandle, l_chr_payload);    -- exception raised here when the payload is not truncated
        utl_file.fclose(fhandle);
    END LOOP;
EXCEPTION
    WHEN OTHERS THEN
        dbms_output.put_line('ERROR: '
                             || sqlcode
                             || ' - '
                             || sqlerrm);
        RAISE;
END;
/
Assuming that temp_tbl.data_payload is a CLOB data type, then you want to loop through the payload reading substring chunks (without needing to store the substrings in a temporary variable). If you want to write more than 1024 characters at one time then you need to specify the 4th argument to UTL_FILE.FOPEN(location, filename, open_mode, max_linesize), as the default is 1024.

Something like this (untested as I don't have your tables or a directory to write to):
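A minimal sketch along those lines, assuming data_payload really is a CLOB and TMP_DIR is an existing directory object (the 1000-character chunk size is arbitrary):

SET SERVEROUTPUT ON
DECLARE
    l_file_name VARCHAR2(100);
    fhandle     utl_file.file_type;
    l_length    PLS_INTEGER;
    l_pos       PLS_INTEGER;
    c_chunk     CONSTANT PLS_INTEGER := 1000;   -- characters written per PUT call (arbitrary)

    CURSOR payload_cur IS
        SELECT data_payload,
               request_reference
          FROM temp_tbl
         WHERE message_type = 'ORDERCREATE'
           AND ROWNUM < 2;

    TYPE payload_typ IS
        TABLE OF payload_cur%rowtype INDEX BY BINARY_INTEGER;
    l_col_payload payload_typ;
BEGIN
    OPEN payload_cur;
    FETCH payload_cur BULK COLLECT INTO l_col_payload;
    CLOSE payload_cur;

    FOR i IN 1 .. l_col_payload.COUNT LOOP
        l_file_name := l_col_payload(i).request_reference || '_' || i || '.json';

        -- 4th argument raises max_linesize from the default 1024 to the 32767 maximum.
        fhandle := utl_file.fopen('TMP_DIR', l_file_name, 'w', 32767);

        -- Walk the CLOB in fixed-size chunks, writing each chunk straight from
        -- DBMS_LOB.SUBSTR so no intermediate variable is needed.
        l_length := dbms_lob.getlength(l_col_payload(i).data_payload);
        l_pos    := 1;
        WHILE l_pos <= l_length LOOP
            utl_file.put(fhandle,
                         dbms_lob.substr(l_col_payload(i).data_payload, c_chunk, l_pos));
            l_pos := l_pos + c_chunk;
        END LOOP;

        utl_file.fclose(fhandle);
    END LOOP;
END;
/

Note that even with the larger max_linesize, UTL_FILE still caps a single line at 32767 bytes; if a payload can exceed that without any newline in it, you would need to write a line terminator periodically or open the file in 'wb' mode and use UTL_FILE.PUT_RAW instead.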