Hope everyone is doing well...
We are exploring whether it is possible to organize a few of our jars in a folder in the Workspace and have them copied around as part of the init scripts.
For example, in the workspace we have the following structure.
/Workspace/<Folder_Name1>/jars/sample_name_01.jar
The init script would attempt to copy it to a path on DBFS or the driver node's local file system.
#!/bin/bash
# Copy the jar from the workspace folder to DBFS and to the driver's local disk.
cp /Workspace/<Folder_Name1>/jars/sample_name_01.jar /dbfs/jars/
cp /Workspace/<Folder_Name1>/jars/sample_name_01.jar /tmp/jars/
Of course, the init script is failing with the error message:
cp: cannot stat '/Workspace/<Folder_Name1>/jars/sample_name_01.jar': No such file or directory
I have tried the path both with and without /Workspace included. I have also tried accessing the files from the web terminal, and I am able to see them there.
- Are workspace files accessible via init scripts?
- Is there a limitation for jars and whl/egg files?
- What is the right syntax to access them?
- Does it make sense to have the jars (only a few) as part of the workspace files, or in DBFS?
Thanks for all the help... Cheers...
Update 01:
Tried some of the suggestions received via other means...
- Considering that init scripts stored in the Workspace are referenced without the /Workspace prefix, I have also tried the jar paths without it, but I still hit the same issue.
- Have also tried listing the files and printing them; the path itself does not seem to be recognized.
- Have also tried sleeping for up to 2 minutes to give the mounts some time, still nothing (roughly as in the sketch below)...
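Roughly, the diagnostic version of the script looked like the sketch below (the log path /tmp/init_debug.log is just illustrative).
#!/bin/bash
# Log what the init script can actually see under /Workspace.
ls -la /Workspace >> /tmp/init_debug.log 2>&1
ls -la /Workspace/<Folder_Name1>/jars >> /tmp/init_debug.log 2>&1
# Wait up to ~2 minutes in case the workspace mount shows up late.
for i in $(seq 1 24); do
  if [ -f /Workspace/<Folder_Name1>/jars/sample_name_01.jar ]; then
    break
  fi
  sleep 5
done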
First, check that you have permissions on the workspace folder and the jars folder. If you do, and cp is still not working, below are the possible reasons.
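For example, from the web terminal you can list the folder and the jar to confirm they are visible and readable (same placeholder paths as above):
# List the workspace folder and the specific jar.
ls -la /Workspace/<Folder_Name1>/jars/
ls -l /Workspace/<Folder_Name1>/jars/sample_name_01.jar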
When admins upload jar files, there are two options.
Option 1: Below is how it is done when the jar is uploaded as a library.
After that, it prompts for the upload. After clicking Create, you can see the result: it gives the option to install the library on a cluster, and it shows the Source path, which is what you need.
When uploading as a library, the jars land in a DBFS path by default (the Source path shown in the library UI).
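In that case, the init script needs to copy from the DBFS Source path rather than the workspace path. A minimal sketch, assuming a hypothetical Source path under /dbfs/FileStore/jars/ (use whatever Source path the library UI actually shows for your jar):
#!/bin/bash
# Copy a library-uploaded jar from its DBFS Source path (hypothetical path shown).
mkdir -p /tmp/jars
cp /dbfs/FileStore/jars/sample_name_01.jar /tmp/jars/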
Option 2: When the jar is uploaded as just a file.
You get a prompt for the file upload, then click Create, and the jars are uploaded. If you use your copy command on a jar uploaded as a file, that will work.
If you still get the same error, then it is a permissions issue. The possible solution is to run the code below in a notebook after cluster creation.
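A minimal sketch of that idea, assuming the same placeholder paths as above, run from a %sh cell in a notebook:
%sh
# Create the target folders and copy the jar from the workspace folder
# once the cluster (and its /Workspace mount) is up.
mkdir -p /dbfs/jars /tmp/jars
cp /Workspace/<Folder_Name1>/jars/sample_name_01.jar /dbfs/jars/
cp /Workspace/<Folder_Name1>/jars/sample_name_01.jar /tmp/jars/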
Note - This does not work if admins upload the jar as a library. As I mentioned above, those jars will be available in DBFS only.