I am working on a bash script (using jq for JSON parsing) that needs to:
- make multiple CURL calls (the responses have the same structure but different values), apply some logic/filters, and then collate all the responses into one final JSON array of objects;
- loop through this final JSON array and write it into a CSV in a predefined format.
 
I searched for a while for both requirements but could not find anything concrete. Please advise. The highlighted steps below (in ***) are the points where I need help.
Sample Flow:
create empty FINAL array
for (eachService in serviceList)
       a. CURL <service_response> returning JSON array of objects
       b. use jq filters to parse the JSON response, apply some logic, and modify elements in the response as needed
       c. ***add this JSON array to FINAL array***
***LOOP through FINAL array, one object at a time, and write to CSV.***
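Roughly, what I have so far in bash (the serviceList URLs and the step-b filter are placeholders I made up):

#!/usr/bin/env bash
serviceList=("https://svc1.example.com/stats" "https://svc2.example.com/stats")

final='[]'                                   # create empty FINAL array
for service in "${serviceList[@]}"; do
  response=$(curl -s "$service")             # a. JSON array of objects
  filtered=$(jq 'map(select(.calls >= 0))' <<< "$response")   # b. placeholder filter
  # c. TODO: append $filtered to $final -- this is where I am stuck
done
# TODO: loop through $final and write each object to a CSV row -- and here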
Sample Data:
CURL Response 1 (ex: $curl1):
[
  {
    "id":"123",
    "startDate": "2016-12-09T00:00:00Z",
    "calls":4
  },
  {
    "id":"456",
    "startDate": "2016-12-09T00:00:00Z",
    "calls":22
  }
]
CURL Response 2 (ex: $curl2):
[
  {
    "id":"789",
    "startDate": "2016-12-09T00:00:00Z",
    "calls":8
  },
  {
    "id":"147",
    "startDate": "2016-12-09T00:00:00Z",
    "calls":10
  }
]
NEEDED OUTPUT ($final): 
[
  {
    "id":"123",
    "startDate": "2016-12-09T00:00:00Z",
    "calls":4
  },
  {
    "id":"456",
    "startDate": "2016-12-09T00:00:00Z",
    "calls":22
  },
  {
    "id":"789",
    "startDate": "2016-12-09T00:00:00Z",
    "calls":8
  },
  {
    "id":"147",
    "startDate": "2016-12-09T00:00:00Z",
    "calls":10
  }
]
jq can deal with multiple input arrays. You can pipe the whole output of the loop to it.
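For example, a minimal sketch (reusing the placeholder serviceList from the question; replace the identity filter '.' with whatever your step b needs):

final=$(
  for service in "${serviceList[@]}"; do
    # each iteration prints one JSON array on stdout
    curl -s "$service" | jq '.'
  done | jq -s 'add'   # -s slurps every array into one array of arrays; add concatenates them
)

The -s (slurp) option collects all the input arrays into a single enclosing array, and add then concatenates them into the flat array shown as $final above. (jq -n '[inputs] | add' is an equivalent spelling.)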
Note that the CSV transformation can be done with @csv:
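A sketch of that step, assuming the three fields from the sample data and a made-up output file name:

# -r emits raw text rather than JSON-encoded strings
jq -r '.[] | [.id, .startDate, .calls] | @csv' <<< "$final" > report.csv

@csv quotes string fields and leaves numbers bare, so the first row comes out as "123","2016-12-09T00:00:00Z",4. No bash loop is needed: jq's .[] already iterates over the objects one at a time. If you need a header row, print it before the jq output:

{ echo '"id","startDate","calls"'; jq -r '.[] | [.id, .startDate, .calls] | @csv' <<< "$final"; } > report.csv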