How to automatically upload device media files to AWS S3?

How to stream device media files to AWS S3 cloud storage

The S3 Media Upload stream automatically detects media files (photos, videos, tachograph files) in flespi device messages and uploads them to an S3 bucket with a configurable path structure, providing reliable cloud storage for telematics media data.

The stream currently supports upload to AWS S3 storage only; support for other S3-compatible storage providers may be added in the future.

For each device message that contains media files (media.image.X, media.video.X, media.tacho.X), the stream uploads all of them into the S3 bucket, applying the configurable object prefix. All such upload attempts are reported in the stream log inside the data parameter.

For each file upload attempt, the data parameter of the stream log record contains the following fields:

  • result: upload status - true/false;
  • message_parameter: device message parameter name from which this media information was extracted;
  • media: original device media information;
  • s3: upload information provided by S3 with bucket, key, region and url properties;

By subscribing to stream logs with a webhook you can process successful file uploads and perform extra actions such as automatically removing the media file from flespi and/or enriching the device message with a link to the S3 object.
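
For example, a minimal webhook receiver could look like the sketch below. It assumes a simple Flask app and a webhook configured to POST the stream log records as JSON; all names here are illustrative, only the data fields follow the list above:
# Minimal sketch of a webhook receiver for stream log records.
# Assumes Flask and a webhook configured to POST the log records as JSON.
from flask import Flask, request

app = Flask(__name__)

@app.route("/stream-logs", methods=["POST"])
def stream_logs():
    payload = request.get_json(force=True)
    records = payload if isinstance(payload, list) else [payload]
    for record in records:
        for upload in record.get("data", []):      # one entry per file upload attempt
            if not upload.get("result"):           # skip failed uploads
                continue
            s3_info = upload.get("s3", {})
            print("uploaded", upload.get("message_parameter"), "->", s3_info.get("url"))
            # here you could delete the media file from flespi
            # or enrich the device message with s3_info["url"]
    return "", 200

if __name__ == "__main__":
    app.run(port=8080)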

How to use

Go to Telematics hub -> Streams -> green “+” button.

Give your stream a name.

Pick the “media-upload-s3” protocol ID.

Set "media==true" in stream's validate_message field to handle only messages with media files attached.

Enter the AWS S3 configuration properties: region, bucket, access key ID, and secret access key. Follow the AWS S3 Configuration guide below to obtain them.
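
The same stream can also be created via the flespi REST API. The sketch below shows the general shape of such a request; the configuration field names (region, bucket, keys, prefix) are assumptions here, so verify them against the actual media-upload-s3 protocol schema before use:
# Rough sketch of creating the stream via the flespi REST API.
# NOTE: the configuration field names are assumptions -- verify them against
# the media-upload-s3 protocol schema in the flespi API documentation.
import requests

FLESPI_TOKEN = "YOUR-FLESPI-TOKEN"

stream = {
    "name": "my-media-upload-s3",
    "protocol_name": "media-upload-s3",
    "validate_message": "media==true",      # handle only messages with media files
    "configuration": {                      # assumed field names
        "region": "us-east-1",
        "bucket": "my-app-uploads-2024",
        "access_key_id": "AKIA...",
        "secret_access_key": "...",
        "prefix": "{device_id}/{year}/{month}/{day}",
    },
}

resp = requests.post(
    "https://flespi.io/gw/streams",
    json=[stream],
    headers={"Authorization": f"FlespiToken {FLESPI_TOKEN}"},
)
resp.raise_for_status()
print(resp.json())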

Prefix Templating (Optional): The prefix field supports dynamic placeholders: {device_id}, {device_name}, {ident}, {year}, {month}, {day}. The default is "{device_id}/{year}/{month}/{day}". Examples (a substitution sketch follows the list):

  • "fleet/{ident}/media""fleet/123456789012345/media"
  • "cameras/{device_name}/{year}-{month}""cameras/Fleet_Truck_01/2024-08"

Click Save and the new stream will appear in the streams list.

Now subscribe the stream to a device capable of reporting media files (video or tacho enabled). If the device already has media files reported in its messages, you can navigate to the device LOG & MESSAGES tab, enter "media == true" in the messages filter, right-click a message with media to open the context menu, and select "Re-register message" to publish it into the device again. The stream will see the re-published message and process it.

Analyze the stream logs to verify successful media file uploads or troubleshoot failures.

To simplify media upload testing, we suggest using a Telegram channel and device, the simplest device setup in flespi capable of reporting media files.

Accessing Uploaded Files
Stream logs contain the complete S3 URL in the data[].s3.url field. Uploaded files are private by default. For public access, follow the instructions in the AWS configuration guide on making uploaded files publicly readable.
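
If you keep the files private, you can still hand out temporary download links by generating pre-signed URLs with the same AWS credentials. Below is a standard boto3 call; the bucket and key are placeholders and would normally come from the data[].s3 fields of the stream log:
# Generate a temporary download link for a private object with boto3.
# Bucket/key are placeholders -- take them from data[].s3 in the stream log.
import boto3

s3 = boto3.client("s3", region_name="us-east-1")   # assumes AWS credentials are configured

url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "my-app-uploads-2024", "Key": "123456/2024/08/15/photo.jpg"},
    ExpiresIn=3600,   # link valid for one hour
)
print(url)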

AWS S3 Configuration

Your objective is to configure AWS S3 storage and obtain the S3 Bucket Name, Bucket Region, AWS Access Key ID, and AWS Secret Access Key values with which you can create the flespi stream.

The easiest way to configure AWS S3 upload is to start a session with a modern LLM of your choice (Gemini, ChatGPT, Claude, etc.) and work through it. Feed it exactly the following task description:

Objective: Generate a comprehensive, step-by-step guide for a user with a root AWS account to configure the necessary AWS resources for secure, programmatic file uploads to an S3 bucket.

Core Requirements: Your guide must instruct the user on how to perform the following actions from scratch using the AWS Management Console:

  1. Create a dedicated S3 Bucket: Detail the process of creating a new S3 bucket in a specific region.

  2. Create a new IAM User: Explain how to create a new IAM user with programmatic access to generate an Access Key ID and a Secret Access Key.

  3. Define and Attach a Least-Privilege IAM Policy: Provide instructions to create and attach a custom IAM policy for the new user. This policy must grant only the s3:PutObject permission (for uploading files) and s3:ListBucket permission (for bucket availability testing) and be scoped specifically to the S3 bucket created in step 1.

  4. Summarize Critical Information: Conclude the core setup by creating a clear summary section that lists the essential pieces of information the user will need for any application or script: S3 Bucket Name, Bucket Region, AWS Access Key ID, AWS Secret Access Key.

Secondary Objective (Optional Step):

After detailing the core setup, add a distinct section explaining how to make the uploaded files publicly readable. This section should cover:

  1. Editing the S3 bucket's "Block Public Access" settings.

  2. Applying a bucket policy that grants s3:GetObject permissions to the public (Principal: "*").

Use the LLM directly for any follow-up questions. Alternatively, you are welcome to follow the step-by-step AWS S3 upload configuration guide below.

AWS S3 Upload Configuration Guide

This guide will walk you through setting up secure, programmatic file uploads to an S3 bucket using the AWS Management Console with least-privilege security principles.

Prerequisites

  • Root AWS account access
  • Access to AWS Management Console

Step 1: Create a Dedicated S3 Bucket

1.1 Navigate to S3 Service

  1. Sign in to the AWS Management Console
  2. In the search bar at the top, type "S3" and select Amazon S3
  3. Click Create bucket

1.2 Configure Basic Settings

  1. Bucket name: Enter a globally unique bucket name (e.g., my-app-uploads-2024)
    • Must be 3-63 characters long
    • Can contain lowercase letters, numbers, and hyphens
    • Must start and end with a letter or number
  2. AWS Region: Select your preferred region (e.g., US East (N. Virginia) us-east-1)
    • Note this region for later use

1.3 Configure Bucket Settings

  1. Object Ownership: Leave as "ACLs disabled (recommended)"
  2. Block Public Access settings: Keep all four options checked (recommended for security)
  3. Bucket Versioning: Choose "Disable" (unless you specifically need versioning)
  4. Default encryption:
    • Select "Server-side encryption with Amazon S3 managed keys (SSE-S3)"
    • Leave other encryption settings as default

1.4 Create the Bucket

  1. Review your settings
  2. Click Create bucket
  3. Important: Note down the exact bucket name and region for later use (if you prefer the AWS SDK over the console, see the sketch after this step)
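
If you prefer the AWS SDK over the console and already have administrator credentials configured locally, the same bucket can be created with a short boto3 call. The bucket name and region below are placeholders:
# Optional SDK alternative to the console steps above (placeholder names).
import boto3

region = "us-east-1"
s3 = boto3.client("s3", region_name=region)

if region == "us-east-1":
    # us-east-1 is the default location and must not be passed explicitly
    s3.create_bucket(Bucket="my-app-uploads-2024")
else:
    s3.create_bucket(
        Bucket="my-app-uploads-2024",
        CreateBucketConfiguration={"LocationConstraint": region},
    )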

Step 2: Create a New IAM User

2.1 Navigate to IAM Service

  1. In the AWS Console search bar, type "IAM" and select IAM
  2. In the left sidebar, click Users
  3. Click Create user

2.2 Configure User Details

  1. User name: Enter a descriptive name (e.g., s3-upload-user)
  2. Provide user access to the AWS Management Console: Leave unchecked (we only want programmatic access)
  3. Click Next

2.3 Set Permissions (Temporary)

  1. Select Attach policies directly
  2. For now, don't attach any policies (we'll create a custom policy later)
  3. Click Next

2.4 Review and Create User

  1. Review the user details
  2. Click Create user
  3. You'll see a success message with the username

Step 3: Generate Access Keys for Programmatic Access

3.1 Access the New User

  1. From the Users list, click on the username you just created
  2. Click on the Security credentials tab
  3. Scroll down to Access keys section
  4. Click Create access key

3.2 Configure Access Key

  1. Use case: Select Application running outside AWS
  2. Check the confirmation checkbox
  3. Click Next

3.3 Set Description (Optional)

  1. Description tag: Enter something descriptive like "S3 upload application"
  2. Click Create access key

3.4 Retrieve Credentials

  1. Critical: Copy and securely store both:
    • Access Key ID (starts with "AKIA...")
    • Secret Access Key (long alphanumeric string)
  2. Click Download .csv file for backup
  3. Click Done

⚠️ Security Warning: This is the only time you'll see the Secret Access Key. Store it securely and never share it publicly.
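
Optionally, you can immediately confirm the new key pair is valid by calling AWS STS with it, which works even before any S3 permissions are attached. The values below are placeholders:
# Optional: verify the new access keys are valid (placeholder values).
import boto3

sts = boto3.client(
    "sts",
    aws_access_key_id="AKIAIOSFODNN7EXAMPLE",
    aws_secret_access_key="wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY",
)
print(sts.get_caller_identity()["Arn"])   # should print the ARN of s3-upload-user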

Step 4: Create a Least-Privilege IAM Policy

4.1 Navigate to Policies

  1. In the IAM dashboard, click Policies in the left sidebar
  2. Click Create policy

4.2 Define Policy Permissions

  1. Click the JSON tab
  2. Replace the default content with the following policy, substituting YOUR-BUCKET-NAME with your actual bucket name. Note that s3:PutObject applies to objects inside the bucket (the .../* ARN), while s3:ListBucket applies to the bucket itself, which is why both ARNs are listed:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::YOUR-BUCKET-NAME/*",
                "arn:aws:s3:::YOUR-BUCKET-NAME"
            ]
        }
    ]
}

4.3 Review and Create Policy

  1. Click Next
  2. Policy name: Enter a descriptive name (e.g., S3-With-HeadBucket-Policy)
  3. Description: Enter something like "Allows PutObject and ListBucket permissions for a specific S3 bucket"
  4. Click Create policy

Step 5: Attach Policy to User

5.1 Navigate Back to User

  1. Go back to IAM -> Users
  2. Click on your created user (e.g., s3-upload-user)
  3. Click on the Permissions tab
  4. Click Add permissions

5.2 Attach the Custom Policy

  1. Select Attach policies directly
  2. In the search box, type the name of your custom policy (e.g., S3-With-HeadBucket-Policy)
  3. Check the box next to your policy
  4. Click Next
  5. Click Add permissions

After completing this setup, you now have the following essential information needed for your application or scripts:

📋 Configuration Details

Parameter | Value | Example
S3 Bucket Name | [Your bucket name] | my-app-uploads-2024
Bucket Region | [Your selected region] | us-east-1
AWS Access Key ID | [Starts with AKIA...] | AKIAIOSFODNN7EXAMPLE
AWS Secret Access Key | [Long alphanumeric string] | wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
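
To sanity-check these values before plugging them into the flespi stream, you can exercise exactly the two granted permissions with boto3: head_bucket relies on s3:ListBucket and put_object on s3:PutObject. All values below are placeholders:
# Quick verification of the new credentials against the least-privilege policy.
# All values are placeholders -- substitute your own from the table above.
import boto3

s3 = boto3.client(
    "s3",
    region_name="us-east-1",
    aws_access_key_id="AKIAIOSFODNN7EXAMPLE",
    aws_secret_access_key="wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY",
)

s3.head_bucket(Bucket="my-app-uploads-2024")        # requires s3:ListBucket
s3.put_object(Bucket="my-app-uploads-2024",
              Key="test/hello.txt", Body=b"hello")  # requires s3:PutObject
print("Bucket reachable and uploads allowed")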


Troubleshooting

Stream logs contain comprehensive information about stream operation. In case of any problems, check the "reason" reported by the stream in its logs; it may indicate which part of the stream configuration is incorrect.
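
Besides the logs panel, stream logs can also be fetched programmatically. The sketch below uses the flespi REST API; the stream ID is a placeholder, and the exact set of log fields may differ from what is shown here:
# Fetch recent stream logs via the flespi REST API (stream ID is a placeholder).
import requests

FLESPI_TOKEN = "YOUR-FLESPI-TOKEN"
STREAM_ID = 12345   # placeholder

resp = requests.get(
    f"https://flespi.io/gw/streams/{STREAM_ID}/logs",
    headers={"Authorization": f"FlespiToken {FLESPI_TOKEN}"},
)
resp.raise_for_status()
for log in resp.json().get("result", []):
    print(log.get("timestamp"), log.get("reason"), log.get("data"))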

Stream failures are typically caused by:

  • Incorrect AWS S3 configuration - most probably the problem is in the security policies for the bucket;
  • The device message does not contain the expected parameters with media data.

Change log

To stay in sync with all improvements and changes to the S3 media upload stream, subscribe to the change log on the forum.


See also
Using Tachograph functionality
How to configure Tacho Bridge App to provide remote tachograph authentication via company cards