Can Microsoft Fabric's Data Lakehouse handle geospatial data such as shp, geojson, and other formats?
I've tried getting a GeoPandas GeoDataFrame into PySpark and writing it out with `spark_df.write.parquet(destination_path)`, but I can't seem to get the data in.
So, is it possible, and are there any good tutorials out there specifically for geospatial data?
First, read the geospatial data from the SHP or GeoJSON file and convert it to a Spark DataFrame using the code block below.
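A minimal sketch of that conversion; the input path is hypothetical, and the geometry column is encoded as WKT strings since Spark can't serialize shapely geometry objects directly:

```python
import geopandas as gpd
import pandas as pd
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Read the shapefile into a GeoDataFrame (gpd.read_file also handles
# GeoJSON) -- this path is a placeholder, point it at your own file
gdf = gpd.read_file("/lakehouse/default/Files/geodata/input.shp")

# Spark cannot serialize shapely geometry objects, so encode the
# geometry column as WKT strings before handing it to Spark
gdf["geometry"] = gdf["geometry"].apply(lambda geom: geom.wkt)

# Create a Spark DataFrame from the now plain-typed DataFrame
spark_df = spark.createDataFrame(pd.DataFrame(gdf))
spark_df.printSchema()
```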
Next, write the data to OneLake. Before writing, make sure you enable Azure Data Lake Storage credential passthrough under the Advanced options.
Now, go to your lakehouse > Files and create a new folder. I've created a folder named geodata.
Open the properties of that folder and copy the ABFS path.
Then run the code below to write it.
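A sketch of the write step; the placeholders in the ABFS path are illustrative, so paste in the path you actually copied from the folder properties:

```python
# ABFS path copied from the folder properties -- the workspace and
# lakehouse names below are placeholders, not real values
abfs_path = "abfss://<workspace>@onelake.dfs.fabric.microsoft.com/<lakehouse>.Lakehouse/Files/geodata"

# Write the Spark DataFrame to OneLake as Parquet
spark_df.write.mode("overwrite").parquet(f"{abfs_path}/geodata.parquet")
```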
The data is now written to OneLake.
Using the same ABFS path, you can read it back.
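For example, assuming the same `abfs_path` variable and file name from the write step:

```python
# Read the Parquet file back from OneLake into a Spark DataFrame
df = spark.read.parquet(f"{abfs_path}/geodata.parquet")
df.show(5)
```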
To convert the geometry column back to the geometry type, you can use the code below.
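One way to do it, assuming the WKT encoding used above; the CRS below is an assumption, so replace it with your data's actual CRS:

```python
import geopandas as gpd
from shapely import wkt

# Collect to pandas, then parse the WKT strings back into shapely geometries
pdf = df.toPandas()
pdf["geometry"] = pdf["geometry"].apply(wkt.loads)

# Rebuild the GeoDataFrame -- EPSG:4326 is an assumed CRS, not read from the data
gdf = gpd.GeoDataFrame(pdf, geometry="geometry", crs="EPSG:4326")
print(gdf.geom_type.head())
```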