Data Science Asked by mozilla_firefox on December 15, 2020
I know that we can use Kaggle's API directly in Google Colab to download a dataset. The commands are:
!mkdir .kaggle
!echo '{"username":"somename","key":"apikey"}' > /root/.kaggle/kaggle.json
!chmod 600 /root/.kaggle/kaggle.json
!kaggle competitions download -c bluebook-for-bulldozers -p /content
But I need to repeat this process of creating the .kaggle file and passing the API key on the Colab GPU runtime every time. And sometimes the echo command fails, saying there is no file called .kaggle, but after about two minutes, without restarting the kernel, it works. It sounds funny, but it is true. Can't I just create the .kaggle file once, run these commands once, download the dataset once, store it somewhere, and reuse it later? I have used the Google Drive mount approach, but it is tedious and uploading datasets to Drive takes a lot of time. It would also be fine if I had to download the dataset every time using just this one command, rather than recreating the .kaggle file and writing the API key and username into it each time:
!kaggle competitions download -c bluebook-for-bulldozers -p /content
The earlier commands rarely succeed in one go, which wastes a lot of time.
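If the intermittent mkdir/echo failures are the main annoyance, the same one-time setup can be done robustly in a single Python cell. This is a sketch using the placeholder credentials from the question ("somename"/"apikey" are not real values); in Colab it still has to run once per fresh runtime unless kaggle.json is copied in from somewhere persistent such as Drive:

```python
import json
import stat
from pathlib import Path

# One-time credential setup, done in Python instead of shell commands.
# "somename" / "apikey" are placeholders from the question, not real credentials.
creds = {"username": "somename", "key": "apikey"}

kaggle_dir = Path.home() / ".kaggle"
# parents=True / exist_ok=True avoid the "no file called .kaggle" failure mode
kaggle_dir.mkdir(parents=True, exist_ok=True)

token_path = kaggle_dir / "kaggle.json"
token_path.write_text(json.dumps(creds))
token_path.chmod(stat.S_IRUSR | stat.S_IWUSR)  # same effect as chmod 600
```

After this cell, `!kaggle competitions download -c bluebook-for-bulldozers -p /content` can be run on its own.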
Step 1 - Refer to the Kaggle API documentation to understand the basics and to get your key: https://github.com/Kaggle/kaggle-api
Step 2 - Use these lines of code -
import os

# Set the credentials as environment variables *before* importing kaggle,
# because the library authenticates on import.
os.environ['KAGGLE_USERNAME'] = "jha01roshan"
os.environ['KAGGLE_KEY'] = "xxxxxxxxxxxxxxxx"

import kaggle
!kaggle competitions download -c ashrae-energy-prediction

import pandas as pd
building_metadata = pd.read_csv("/content/building_metadata.csv")
sample_submission = pd.read_csv("/content/sample_submission.csv.zip")  # pandas reads the zip directly
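A side note on that last line: pandas can read a CSV straight out of a .zip archive (when the archive holds a single file), which is why sample_submission.csv.zip needs no manual unzipping. A minimal, self-contained demonstration with a made-up two-row file standing in for the Kaggle download:

```python
import os
import tempfile
import zipfile

import pandas as pd

# Build a tiny zip containing one CSV, as a stand-in for a Kaggle download.
tmp_dir = tempfile.mkdtemp()
zip_path = os.path.join(tmp_dir, "sample_submission.csv.zip")
with zipfile.ZipFile(zip_path, "w") as zf:
    zf.writestr("sample_submission.csv", "row_id,meter_reading\n0,0.0\n1,1.5\n")

# Compression is inferred from the .zip extension.
df = pd.read_csv(zip_path)
print(df.shape)  # (2, 2)
```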
Answered by 10xAI on December 15, 2020