Hi,
Thanks for your response.
We tried all of the suggested approaches; here is the code we ran:
jdbc_jar_path = "abfss://container@storageacoount.dfs.core.windows.net/pipeline_dev/wms/infor-compass-jdbc-2023.10.jar" # Path in your storage account
ionapi_path = "abfss://container@storageacoount.dfs.core.windows.net/pipeline_dev/wms/Infor Compass JDBC Driver.ionapi" # Path in your storage account
key='xxxxx'
from notebookutils import mssparkutils
#from mssparkutils import fs
ionapi_file_path = "abfss://container@storageacoount.dfs.core.windows.net/pipeline_dev/wms/Infor Compass JDBC Driver.ionapi"
local_ionapi_path = "/tmp/Infor Compass JDBC Driver.ionapi"
mssparkutils.fs.cp(ionapi_file_path, local_ionapi_path)
# Note: a download_file method is not available in the fs package, so fs.cp is used instead
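Since the driver error quoted at the end of this post says the .ionapi file must be located in the folder containing the JDBC driver JAR, one thing worth trying is staging both files into the same local folder on the driver node. This is only a sketch: the `/tmp/infor_jdbc` folder is hypothetical, and the assumption that `mssparkutils.fs.cp` needs an explicit `file:` scheme for local destinations should be verified in your environment.

```python
import os

try:
    from notebookutils import mssparkutils  # only available inside Synapse/Fabric
except ImportError:
    mssparkutils = None  # e.g. when running outside a Synapse notebook

# Hypothetical local staging folder on the driver node.
local_dir = "/tmp/infor_jdbc"
os.makedirs(local_dir, exist_ok=True)

jdbc_jar_path = "abfss://container@storageacoount.dfs.core.windows.net/pipeline_dev/wms/infor-compass-jdbc-2023.10.jar"
ionapi_path = "abfss://container@storageacoount.dfs.core.windows.net/pipeline_dev/wms/Infor Compass JDBC Driver.ionapi"

def local_dest(remote_path: str, folder: str) -> str:
    """Local destination path that keeps the remote file's name."""
    return os.path.join(folder, os.path.basename(remote_path))

local_jar = local_dest(jdbc_jar_path, local_dir)
local_ionapi = local_dest(ionapi_path, local_dir)

if mssparkutils is not None:
    # fs.cp generally needs the file: scheme for local destinations
    # (an assumption to verify in your environment).
    mssparkutils.fs.cp(jdbc_jar_path, "file:" + local_jar)
    mssparkutils.fs.cp(ionapi_path, "file:" + local_ionapi)
```

With both files staged in `local_dir`, the JAR and the .ionapi file end up side by side, which is what the driver error asks for.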
from pyspark.sql import SparkSession
spark = SparkSession.builder \
    .appName("InforDataLakeConnection") \
    .config("spark.jars", jdbc_jar_path) \
    .config("spark.executor.extraClassPath", "abfss://container@storageacoount.dfs.core.windows.net/pipeline_dev/wms/") \
    .getOrCreate()
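One caveat worth checking: in a Synapse notebook a Spark session usually already exists before this cell runs, so `.config(...)` calls on a new builder may be silently ignored, and `spark.executor.extraClassPath` expects local filesystem paths rather than `abfss://` URIs. An alternative (a sketch; the path is your own JAR location) is the `%%configure` magic in the first cell of the notebook, before the session starts:

```
%%configure -f
{
    "conf": {
        "spark.jars": "abfss://container@storageacoount.dfs.core.windows.net/pipeline_dev/wms/infor-compass-jdbc-2023.10.jar"
    }
}
```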
jdbc_properties = {
    "user": "",
    "password": "",
    "driver": "com.infor.idl.jdbc.Driver",
    "ionapi": local_ionapi_path
}
jdbc_url="jdbc:infordatalake://tenant_id"
table_name = "xxxxx"
df = spark.read.jdbc(url=jdbc_url, table=table_name, properties=jdbc_properties)
After trying the above code we are still facing the same issue in the PySpark notebook; the error message still says that "the file must be located in the folder containing the Compass JDBC driver JAR file."
Is there any other way we can call a Python script from the Synapse notebook level? Please suggest.
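On the last question: two common ways to invoke other Python code from a Synapse notebook are the `%run` magic and `mssparkutils.notebook.run`. A minimal sketch (the child notebook name and the timeout value are hypothetical):

```python
try:
    from notebookutils import mssparkutils  # only available inside Synapse/Fabric
except ImportError:
    mssparkutils = None  # e.g. when testing outside a Synapse notebook

if mssparkutils is not None:
    # Run another notebook in the workspace and capture its exit value;
    # 90 is a timeout in seconds.
    result = mssparkutils.notebook.run("ChildNotebook", 90)
else:
    result = None  # running outside Synapse
```

Inside a notebook cell, `%run ChildNotebook` achieves a similar effect while sharing the calling session's variables and Spark context.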