9/5/2023

Install Spark locally

I am used to working in Jupyter notebooks and Python. So, typing the command below from the installation instructions:

pip3 show hail | grep Location | awk -F' ' ''

I got the hail directory path, which is as shown below:

/home/abcd/.pyenv/versions/3.7.2/envs/bio/lib/python3.7/site-packages/hail

Later, when I run the following commands in a Jupyter notebook:

hail_home = Path('/home/abcd/.pyenv/versions/3.7.2/envs/bio/lib/python3.7/site-packages/hail')
hail_jars = hail_home/'build'/'libs'/'hail-all-spark.jar'
sc = pyspark.SparkContext('local', 'Hail', conf=conf)

I get:

TypeError: 'JavaPackage' object is not callable

Is anything wrong with my hail_home path? The command in the doc gives the path above for hail_home, but I realize that I don't have the folder build under hail_home, which is causing the issue while identifying the java_package. May I check why the path is different? Have I installed it in an incorrect location, since the documentation mentions another location?

So I updated:

hail_jars = hail_home/'backend'/'hail-all-spark.jar'
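The path mix-up above can be sketched as a small helper that checks both jar locations, the build/libs path the docs mention (which exists only in a source checkout) and the backend path found under the pip install. This is a minimal sketch based on the two paths in the post; `find_hail_jar` is a hypothetical name, not part of Hail's API, and the layout assumptions should be verified against your own install:

```python
from pathlib import Path


def find_hail_jar(hail_home: Path) -> Path:
    """Return the hail-all-spark.jar path under a hail install.

    Assumption (from the post): a pip install ships the jar at
    <hail_home>/backend/, while <hail_home>/build/libs/ only exists
    when hail was built from source.
    """
    candidates = [
        hail_home / 'backend' / 'hail-all-spark.jar',        # pip layout
        hail_home / 'build' / 'libs' / 'hail-all-spark.jar',  # source build
    ]
    for candidate in candidates:
        if candidate.exists():
            return candidate
    raise FileNotFoundError(f"hail-all-spark.jar not found under {hail_home}")
```

Once the jar is resolved, it would typically be handed to Spark through settings such as spark.jars and spark.driver.extraClassPath on the SparkConf before creating the SparkContext; the 'JavaPackage' object is not callable error is what you see when the jar never makes it onto Spark's classpath.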