Topic starter 13/07/2021 12:15 pm
I have a lot of files in my S3 bucket. Is there an AWS CLI command I can use to find the most recent file with a given prefix in S3?
How can I copy that file from S3 to my local folder?
Can I use Boto3 or another Python library to do this?
13/07/2021 3:50 pm
Here's how to do it in Python:
import boto3

s3_client = boto3.client('s3')

response = s3_client.list_objects_v2(Bucket='MY-BUCKET', Prefix='foo/')
objects = sorted(response['Contents'], key=lambda obj: obj['LastModified'])

# Latest object
latest_object = objects[-1]['Key']
filename = latest_object[latest_object.rfind('/')+1:]  # Remove path

# Download it to the current directory
s3_client.download_file('MY-BUCKET', latest_object, filename)
Basically, you list all objects under the prefix, sort them by LastModified, and take the last one.
Note: A single list_objects_v2() call returns at most 1,000 objects. If the prefix can contain more than that, you need to paginate through the results rather than sort just the first page.
Full Download Function Example:
def download_file(BUCKET_NAME, PREFIX, FILE_NAME):
    try:
        s3 = boto3.client(
            's3',
            aws_access_key_id=ACCESS_KEY,
            aws_secret_access_key=SECRET_KEY,
            region_name=REGION_NAME
        )
        s3_list_response = s3.list_objects_v2(
            Bucket=BUCKET_NAME,
            Prefix=PREFIX
        )
        # Most recently modified object under the prefix
        formatted_response = sorted(
            s3_list_response['Contents'],
            key=lambda item: item['LastModified'])[-1]
        s3_file_key = formatted_response["Key"]
        s3_file_path = "s3://" + BUCKET_NAME + "/" + s3_file_key
        print(f"DOWNLOADING FILE FROM {s3_file_path} TO {LOCAL_BASE_PATH}/{FILE_NAME}")
        # The with block closes the file automatically; no explicit close() needed
        with open(f"{LOCAL_BASE_PATH}/{FILE_NAME}", 'wb') as data:
            s3.download_fileobj(BUCKET_NAME, s3_file_key, data)
        print(f"SUCCESS !!! {FILE_NAME} Successfully Downloaded By myTechMint")
    except Exception as error_1:
        print(f"ERROR !!! Not able to download {FILE_NAME} from AWS S3", error_1)
13/07/2021 3:52 pm
If you want to do it with the AWS CLI:
key=$(aws s3api list-objects --bucket MY-BUCKET --prefix foo/ \
    --query 'sort_by(Contents, &LastModified)[-1].Key' --output text)
aws s3 cp "s3://MY-BUCKET/$key" .