Using Azure Data Factory to read from Log Analytics Tables


Which connector in Azure Data Factory can we use to connect to a Log Analytics workspace? My requirement is to read the "AzureActivity" table and write it to a storage account as a Parquet file.

The reason I want to use ADF to read from the Log Analytics workspace tables, rather than from the Activity Log directly, is that the Activity Log JSON files are already parsed by the ETL process that runs in the Log Analytics workspace, so when we export the tables from there, the JSON output is much less raw and complex to read. Also, once the JSON is converted to Parquet, I already have a Databricks pipeline that reads the Parquet files and populates Delta Lake tables. On top of the Databricks Delta Lake tables, I will produce reports on Azure Activity and, in future, many more diagnostic logs.


1 Answer

Kranthi Pakala

There is no direct/native connector to read data from Log Analytics. A possible option is to retrieve the data through the Log Analytics REST API by using the REST connector in ADF.
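To make the REST-connector approach concrete, here is a minimal sketch of the request that connector would issue against the public Log Analytics Query API endpoint. The workspace ID and bearer token below are placeholders, and the KQL query is just an example; in ADF these would come from the linked service and pipeline parameters.

```python
import json

# Hypothetical placeholder - use your Log Analytics workspace ID
WORKSPACE_ID = "00000000-0000-0000-0000-000000000000"

# Public Log Analytics Query API endpoint that the ADF REST connector would call
url = f"https://api.loganalytics.io/v1/workspaces/{WORKSPACE_ID}/query"

# Example KQL query: pull the last day of AzureActivity rows
body = json.dumps({
    "query": "AzureActivity | where TimeGenerated > ago(1d)"
})

# The token would normally be acquired via a service principal /
# managed identity configured on the ADF linked service (placeholder here)
headers = {
    "Content-Type": "application/json",
    "Authorization": "Bearer <AAD-token>",
}

print(url)
```

In ADF, the same pieces map onto the REST linked service (base URL and authentication) and the Copy activity source (relative URL, request method POST, and the request body above).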

Here is an article by a community volunteer with detailed steps on how to get data from Log Analytics tables using ADF.

Article: Retrieving Log Analytics Data with Data Factory

If the JSON returned from your Log Analytics tables is complex or nested, you may have to use a Mapping Data Flow instead of the Copy activity to flatten it.
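For context on why flattening is needed at all: the Log Analytics Query API does not return one JSON object per row, but a `tables` array with separate `columns` and `rows` lists. The sketch below (a hypothetical, trimmed response, not ADF itself) shows the reshaping that a Mapping Data Flow performs before the data can be written as columnar Parquet.

```python
# Hypothetical, trimmed example of the Query API's response shape
sample_response = {
    "tables": [{
        "name": "PrimaryResult",
        "columns": [{"name": "TimeGenerated", "type": "datetime"},
                    {"name": "OperationName", "type": "string"}],
        "rows": [["2023-01-01T00:00:00Z", "Create VM"],
                 ["2023-01-01T01:00:00Z", "Delete VM"]],
    }]
}

def flatten(response):
    """Turn the columns/rows layout into a list of flat dicts (one per row)."""
    table = response["tables"][0]
    names = [col["name"] for col in table["columns"]]
    return [dict(zip(names, row)) for row in table["rows"]]

records = flatten(sample_response)
print(records[0])  # {'TimeGenerated': '2023-01-01T00:00:00Z', 'OperationName': 'Create VM'}
```

Once the data is in this flat, record-per-row form, writing it out as Parquet for the downstream Databricks pipeline is straightforward.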