Unable to find the path for uploading a file using streamlit Python code

Publish date: 2024-06-11

I'm writing a simple Python application where the user selects a file from their local file manager and uploads it using Streamlit.

I'm able to successfully take the file the user provides using st.file_uploader and store it in a temp directory inside the Streamlit app folder, but the issue is that I can't get the path of the file stored in the newly created directory in order to send it to my GCP Cloud Storage bucket.

Adding my snippet below; any help is appreciated :)

    import streamlit as st
    from google.oauth2 import service_account
    from google.cloud import storage
    import os
    from os import listdir
    from os.path import isfile, join
    from pathlib import Path
    from PIL import Image, ImageOps

    bucketName = ('survey-appl-dev-public')

    # Create API client.
    credentials = service_account.Credentials.from_service_account_info(
        st.secrets["gcp_service_account"]
    )
    client = storage.Client(credentials=credentials)

    # Create a bucket object to get bucket details
    bucket = client.get_bucket(bucketName)

    file = st.file_uploader("Upload An file")

    def main():
        if file is not None:
            file_details = {"FileName": file.name, "FileType": file.type}
            st.write(file_details)
            # img = load_image(image_file)
            # st.image(img, caption='Sunrise by the mountains')
            with open(os.path.join("tempDir", file.name), "wb") as f:
                f.write(file.getbuffer())
                st.success("Saved File")
            object_name_in_gcs_bucket = bucket.blob(".", file.name)
            object_name_in_gcs_bucket.upload_from_filename("tempDir", file.name)

    if __name__ == "__main__":
        main()

I've tried getting the path of the file using the cwd command, and also tried the os library for the file path, but nothing worked.

Edited:
All I wanted to implement is a file upload where the customer selects a file using the file_uploader dropdown. I'm able to save the file into a temporary directory after it is selected, using file.getbuffer() as shown in the code, but I couldn't get it uploaded into the GCS bucket, since it complains that a str cannot be compared with an int when I press the upload button.

Maybe it's a path issue:

"the code is unable to find the path of the file stored in the temp directory"

but I'm unable to figure out how to give the path to the upload function.

The error I'm facing:

    TypeError: '>' not supported between instances of 'str' and 'int'
    Traceback:
    File "/home/raviteja/.local/lib/python3.10/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 564, in _run_script
        exec(code, module.__dict__)
    File "/home/raviteja/test/streamlit/test.py", line 43, in <module>
        main()
    File "/home/raviteja/test/streamlit/test.py", line 29, in main
        object_name_in_gcs_bucket = bucket.blob(".",file.name)
    File "/home/raviteja/.local/lib/python3.10/site-packages/google/cloud/storage/bucket.py", line 795, in blob
        return Blob(
    File "/home/raviteja/.local/lib/python3.10/site-packages/google/cloud/storage/blob.py", line 219, in __init__
        self.chunk_size = chunk_size  # Check that setter accepts value.
    File "/home/raviteja/.local/lib/python3.10/site-packages/google/cloud/storage/blob.py", line 262, in chunk_size
        if value is not None and value > 0 and value % self._CHUNK_SIZE_MULTIPLE != 0:

5 Answers

Thanks all for the responses. After days of struggle, I've at last figured out the mistake I was making. I don't know if I'm right or wrong, so correct me if I'm wrong, but this worked for me:

    object_name_in_gcs_bucket = bucket.blob("path-to-upload" + file.name)

Changing the , to a + between the file path and the file name solved my issue.
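For anyone wondering why the comma broke things: the second positional parameter of Bucket.blob() is chunk_size, an int, so passing the file name there triggers the str-vs-int comparison in the traceback. A minimal sketch in plain Python (no GCS needed; the file name "photo.png" and the blob() stub are hypothetical stand-ins):

```python
# Stub mirroring Bucket.blob()'s signature: the second positional
# parameter is chunk_size, not a path component.
def blob(blob_name, chunk_size=None):
    # Mirrors the size check inside google.cloud.storage.blob.Blob
    if chunk_size is not None and chunk_size > 0:
        pass
    return blob_name

try:
    blob(".", "photo.png")           # comma: "photo.png" lands in chunk_size
except TypeError as err:
    error_message = str(err)         # the same "'>' not supported ..." error

blob_name = blob("path-to-upload" + "photo.png")   # plus: one string argument
```

So the + fix works because it builds a single object-name string instead of accidentally filling the chunk_size parameter.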

Sorry for the small issue; happy that I could solve it.

    import easygui

    if st.button('Add file project.py'):
        file_absolut_path = easygui.fileopenbox(title='Add File', default="*.py")
        st.write(file_absolut_path)

When you upload a file using the Streamlit file uploader, the file is temporarily stored in the computer's memory. This means the file is not saved permanently on the server or on the user's computer; it is only available for the duration of the Streamlit app session. The default size limit is 200MB, but that can be modified. You can read the file directly, but you can only get its name, not a path, because the file lives in memory.
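To make that concrete, here is a rough stand-in for Streamlit's UploadedFile (the class name and sample data below are hypothetical): it behaves like an in-memory file object with a .name attribute, so there is no filesystem path to hand to an upload API, only the raw bytes.

```python
import io

# Rough stand-in for streamlit's UploadedFile, which is an in-memory
# file-like object (a BytesIO subclass) carrying a .name attribute.
class FakeUploadedFile(io.BytesIO):
    def __init__(self, name, data):
        super().__init__(data)
        self.name = name

uploaded = FakeUploadedFile("survey.csv", b"id,answer\n1,yes\n")
data = uploaded.getvalue()       # the bytes you pass to a cloud SDK
print(uploaded.name)             # just a file name, not a path on disk
```

This is why the cloud SDK methods that accept bytes or file objects work directly, while the ones that expect a disk path require you to save the buffer first.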

To upload the file you don't need its path; try this:

    file_data = uploaded_file.getvalue()
    if file_data:
        container_name = 'zipfile'
        blob_client = blob_service_client.get_blob_client(container=container_name, blob=uploaded_file.name)
        blob_client.upload_blob(file_data, overwrite=True)
        st.write("File uploaded successfully!")

You have some variables in your code, and I guess you know what they represent. Try this out; otherwise, make sure you add all relevant information to the question and the code snippet.

    def main():
        file = st.file_uploader("Upload file")
        if file is not None:
            file_details = {"FileName": file.name, "FileType": file.type}
            st.write(file_details)
            file_path = os.path.join("tempDir/", file.name)
            with open(file_path, "wb") as f:
                f.write(file.getbuffer())
                st.success("Saved File")
            print(file_path)

            def upload():
                file_name = file_path
                read_file(file_name)
                st.write(file_name)
                st.session_state["upload_state"] = "Saved successfully!"
                object_name_in_gcs_bucket = bucket.blob("gcp-bucket-destination-path" + file.name)
                object_name_in_gcs_bucket.upload_from_filename(file_path)

            st.write("Youre uploading to bucket", bucketName)
            st.button("Upload file to GoogleCloud", on_click=upload)

    if __name__ == "__main__":
        main()

This one works for me.

Solution 1

    import streamlit as st
    from google.oauth2 import service_account
    from google.cloud import storage
    import os

    STREAMLIT_SCRIPT_FILE_PATH = os.path.dirname(os.path.abspath(__file__))

    credentials = service_account.Credentials.from_service_account_info(
        st.secrets["gcp_service_account"]
    )
    client = storage.Client(credentials=credentials)

    def main():
        bucketName = 'survey-appl-dev-public'
        file = st.file_uploader("Upload file")
        if file is not None:
            file_details = {"FileName": file.name, "FileType": file.type}
            st.write(file_details)
            with open(os.path.join("tempDir", file.name), "wb") as f:
                f.write(file.getbuffer())
            st.success("Saved File")

            bucket = client.bucket(bucketName)
            object_name_in_gcs_bucket = bucket.blob(file.name)
            # src_relative = f'./tempDir/{file.name}'  # also works
            src_absolute = f'{STREAMLIT_SCRIPT_FILE_PATH}/tempDir/{file.name}'
            object_name_in_gcs_bucket.upload_from_filename(src_absolute)

    if __name__ == '__main__':
        main()

Solution 2

Instead of saving the file to disk, upload the file's bytes directly using upload_from_string().

References:

Google Cloud upload_from_string
Streamlit file uploader

    credentials = service_account.Credentials.from_service_account_info(
        st.secrets["gcp_service_account"]
    )
    client = storage.Client(credentials=credentials)

    def gcs_upload_data():
        bucket_name = 'your_gcs_bucket_name'
        file = st.file_uploader("Upload file")
        if file is not None:
            fname = file.name
            ftype = file.type
            file_details = {"FileName": fname, "FileType": ftype}
            st.write(file_details)
            # Define gcs bucket.
            bucket = client.bucket(bucket_name)
            bblob = bucket.blob(fname)
            # Upload the bytes directly instead of a disk file.
            bblob.upload_from_string(file.getvalue(), ftype)

    if __name__ == '__main__':
        gcs_upload_data()
