I am trying to use HugSQL to select some records from my DB. I do get the records back, but I have a few problems when I run this code:
(f/attempt-all [_          (println "Getting DB records")
                db-records (oracle-db/get-records my-db)
                ;; hangs here when there are large (100,000+) result sets
                ...]
  ...)
and this query in my HugSQL file:
-- :name get-records
select /*+ index(src_ski_sysevents_au SRC_SKI_SYSEVENTS_AU_IDX_A) */ *
from src_ski_sysevents_au
where week_id = '201903'
  and rownum < 10000
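For context, get-records is generated from that SQL file with HugSQL's def-db-fns, roughly like this (the namespace and file path below are illustrative, not my real ones):

(ns my-app.oracle-db
  (:require [hugsql.core :as hugsql]))

;; reads the -- :name get-records block above and defines a get-records fn in this namespace
(hugsql/def-db-fns "sql/records.sql")

;; the query takes no parameters, so it is called with just the db spec:
;; (get-records my-db)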
Actual results: db-records is a list of maps (one map for each record returned), like this:
({:remarks nil, :timeprocessing 1994-01-01 00:00:00.000, :devicetype 3M, :staffcode 0M,
:timedeletion 1994-01-01 00:00:00.000, :occurred 1M, :syseventdesig Ticket jam, :time 2019-01-14 10:47:37.350, :syseventno 27M,
:devicedesig P4 LH Exit Wilga ST - 49, :devicenoprocessing 0M, :component nil, :lotabbr P3P4, :deviceabbr P4 LEX WIL,
:week_id 201903M, :lotdesig P3, P4 Levels, :dss_update_time #inst "2019-01-14T15:48:46.000000000-00:00", :loaded_new_yn N,
:operatorfirstname nil, :quantity 1M, :systemeventregno 7365M, :pdi_batch_id 697410M, :lotno 1M, :deviceno 49M, :deleted_yn N,
:centre_no 0012000, :lot_no 0012001, :operatorsurname unattended, :operatornoprocessing 0M}
...
)
I want to know how to convert this list of maps to a JSON string or a CSV file so I can push it to my endpoint, while handling non-string values like
:timeprocessing 1994-01-01 00:00:00.000
As soon as I paste that output into a (comment (def testmap {input_test_Data})) form to test with, I get "Invalid number format" errors. I also found that
:dss_update_time in my DB is "15/JAN/19"
but in the HugSQL results I get something like this
:dss_update_time #inst "2019-01-14T15:48:46.000000000-00:00"
Basically, how would I ensure I get back all the data? I would really appreciate some help on at least the second part of my problem. Thank you.
Please use the function prn instead of println to print out a sample data row. Then you will see that the date values are actually strings (prn will not strip away the double-quote chars like println does).
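For example, with a made-up map holding one of those date strings, the difference shows up immediately at the REPL:

(println {:dss_update_time "15/JAN/19"})   ; prints {:dss_update_time 15/JAN/19}   <- quotes stripped
(prn     {:dss_update_time "15/JAN/19"})   ; prints {:dss_update_time "15/JAN/19"} <- quotes preserved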
At that point, you can just use normal EDN->JSON conversion. There is a simple function for this in the Tupelo library, which just uses the Cheshire lib under the covers and automatically does keyword<->string conversion.
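I believe the Tupelo helper is edn->json in tupelo.core, but if you don't want another dependency you can call Cheshire (and clojure.data.csv for the CSV side) directly. A minimal sketch, assuming db-records is the seq of maps shown above and the output file names are up to you:

(require '[cheshire.core :as json]
         '[clojure.data.csv :as csv]
         '[clojure.java.io :as io])

;; JSON: Cheshire turns keywords into strings and encodes java.util.Date / #inst
;; values as ISO-style date strings by default
(spit "records.json" (json/generate-string db-records))

;; CSV: fix the column order from the first row, then stringify every value
(let [cols (keys (first db-records))]
  (with-open [w (io/writer "records.csv")]
    (csv/write-csv w
                   (cons (map name cols)                               ; header row
                         (map (fn [row] (map #(str (get row %)) cols))
                              db-records)))))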