
@adminjs/upload

AdminJS feature allowing you to upload files to a given resource.

Features

  • Upload files by using different providers (all included):
    • AWS S3
    • Google Cloud Storage
    • Local file system
  • You can create your own upload provider and handle saving yourself by implementing 3 methods
  • Uploading more than one file to one resource, each to a different field
  • Uploading multiple files to an array
  • Configuration options allowing you to define which fields should be persisted and their names in the database
  • Previews of uploaded files

Installation

To install the upload feature run:

yarn add @adminjs/upload

Storing data

The main concept of the upload plugin is that it sends uploaded files to an external source via a class called UploadProvider (3 of them are included out of the box). It then stores in the database the path and folder name under which the file was saved, where:

  • key is the path of the stored file
  • bucket is the name of the folder.

Next, based on the expires option, the system generates either a public URL or a time-constrained URL.

Example: for

  • key: '927292/my-pinky-sweater.png'
  • bucket: 'aee-products' and
  • expires: 0

the path for the file in AWS S3 will be https://aee-products.s3.amazonaws.com/927292/my-pinky-sweater.png and it will always be available (not time-constrained).

Usually the bucket is the same for all the files handled by the feature, so storing it in the database is optional. But it might be handy if you want to change the bucket as the project grows and keep a reference to where the old files went.

To summarize: the important part is that we don't store the actual URL of the file - we store the key and, based on it, compute the path on every request.
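As an illustration only (this is not the plugin's actual code), computing a public S3 path from the stored values could look like this:

// Conceptual sketch: derive a public S3 URL from the stored bucket and key
const publicS3Path = (bucket, key) => `https://${bucket}.s3.amazonaws.com/${key}`

publicS3Path('aee-products', '927292/my-pinky-sweater.png')
// => 'https://aee-products.s3.amazonaws.com/927292/my-pinky-sweater.png'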

Usage

After that short introduction, let's go back to the feature itself.

As with any feature, you have to pass it to the resource in the AdminJSOptions#resources property:

const AdminJS = require('adminjs')
const AdminJSExpress = require('@adminjs/express')
const uploadFeature = require('@adminjs/upload')

// part where you load adapter and models
const User = require('./user')

const options = {
  resources: [{
    resource: User,
    options: {
      listProperties: ['fileUrl', 'mimeType'],
    },
    features: [uploadFeature({
      provider: { aws: { region, bucket, secretAccessKey /* ... */ } },
      properties: {
        key: 'fileUrl', // to this db field the feature will save the S3 key
        mimeType: 'mimeType', // this property is important because it allows previews
      },
      validation: {
        mimeTypes: ['application/pdf'],
      },
    })],
  }],
}

const adminJs = new AdminJS(options)
// and the rest of your app

Previews

The feature supports previews for both audio and images. To make it work, you have to have the mimeType property mapped in the options.

Here we define that the mime type will be saved under the property mimeType:

const options = {
  resources: [{
    resource: User,
    options: { properties: { mimeType: { /** ... **/ } }},
    features: [uploadFeature({
      provider: {},
      properties: {
        key: 'fileUrl',
        mimeType: 'mimeType'
      },
    })]
  }]
}

Providers

Right now the plugin supports both AWS S3 and Google Cloud Storage as cloud hosting. Apart from that, you can store files locally.

AWS setup

Make sure you have aws-sdk installed:

yarn add aws-sdk

To upload files to AWS S3, you first have to create an S3 bucket and obtain your AWS credentials (accessKeyId and secretAccessKey, unless they are already available as environment variables).

Then, fill in all this data in AWSOptions and you are ready to go.

By default, the upload plugin generates a URL which is valid for 24h. If you want files to be always public (public ACL), you need to create a public bucket and set expires to 0.
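A minimal sketch of an S3 configuration (the region and bucket names are placeholders; the credentials fall back to env variables if omitted):

const uploadFeature = require('@adminjs/upload')

const awsUpload = uploadFeature({
  provider: {
    aws: {
      region: 'us-east-1', // region where the bucket was created
      bucket: 'my-app-uploads', // hypothetical bucket name
      accessKeyId: process.env.AWS_ACCESS_KEY_ID, // optional
      secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY, // optional
      expires: 0, // 0 = public ACL; any other value gives time-constrained links
    },
  },
  properties: { key: 'fileUrl' },
})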

Google Storage setup

Make sure you have Google Storage Node SDK installed:

yarn add @google-cloud/storage

and that you are authenticated.

To upload files to Google Cloud Storage, you have to follow all the instructions from: https://github.com/googleapis/nodejs-storage#before-you-begin

Then, fill in the bucket you created in GCPOptions and you are ready to go.

By default, the upload plugin generates a URL which is valid for 24h. If you want files to be always public, set expires to 0.
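A minimal sketch of a Google Cloud Storage configuration (the bucket env variable is a placeholder):

const gcpUpload = uploadFeature({
  provider: {
    gcp: {
      bucket: process.env.PRODUCTS_BUCKET, // bucket created in the GCP console
      expires: 0, // 0 = public files; otherwise links are valid for the given time
    },
  },
  properties: { key: 'fileUrl' },
})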

Local Storage setup

The local storage provider saves files to a local folder.

There are 2 things you have to do before using this Provider.

1. Create the folder (bucket) for the files (e.g. public)

cd your-app
mkdir public

2. Tell your HTTP framework to host this folder

This is an example for an Express server, hosting the public folder created in step 1:

app.use('/public', express.static('public'));

Next, you have to add @adminjs/upload to the given resource:

const options = {
  resources: [{
    resource: User,
    features: [uploadFeature({
      provider: { local: { bucket: 'public' } },
    })]
  }]
}

Custom Provider

The plugin also allows you to pass your own provider. In such a case, you have to pass an instance of a class extending BaseProvider to the provider option.

const { BaseProvider } = require('@adminjs/upload')

class MyProvider extends BaseProvider {
  constructor() {
    // it requires the bucket as a parameter to properly pass it to other methods
    super('public')
  }

  // your implementation goes here
}

const provider = new MyProvider()

const options = {
  resources: [{
    resource: User,
    features: [uploadFeature({ provider })],
  }],
}
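For reference, here is a sketch of what implementing the 3 methods could look like, assuming the BaseProvider contract of upload, delete and path; the fs-based logic is illustrative, not part of the plugin:

const fs = require('fs')
const path = require('path')
const { BaseProvider } = require('@adminjs/upload')

class MyProvider extends BaseProvider {
  constructor() {
    super('public') // bucket name, passed on to the other methods
  }

  // receives the uploaded file (a temporary file on disk) and the computed key
  async upload(file, key) {
    const destination = path.join(this.bucket, key)
    await fs.promises.mkdir(path.dirname(destination), { recursive: true })
    await fs.promises.copyFile(file.path, destination)
  }

  // receives the stored key and bucket when a file should be removed
  async delete(key, bucket) {
    await fs.promises.unlink(path.join(bucket, key))
  }

  // returns the path under which the frontend can reach the file
  path(key, bucket) {
    return `/${bucket}/${key}`
  }
}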

Options

This feature requires just one field in the database to store the path (bucket key) of the uploaded file.

But it can also store more data, like bucket, mimeType, size, etc. Field mapping can be done in options.properties.

Mapping fields is a process of telling @adminjs/upload that data from the field on the left should go to the database under the field on the right.

So below, the key property will be stored under the mixed property uploadedFile, in its sub-property key (mixed properties are JSONB properties in SQL databases or nested schemas in MongoDB).

Some properties are stored in the database, but some of them serve as couriers in the request/response cycle. For example, the file property is used to send actual File objects from the frontend to the backend. Then the file is uploaded and its key (and bucket) are stored, but the value of the file property itself is not saved in the database, meaning you don't have to have it in your DB schema.

Example setup for mapping properties

uploadFeature({
  provider: {},
  properties: {
    // virtual properties, created by this plugin on the go. They are not stored in the database.
    // this is where the frontend will send info to the backend
    file: `uploadedFile.file`,
    // here is where the backend will send the path to the file to the frontend [virtual property]
    filePath: `uploadedFile.filePath`,
    // here the backend will send information about which files have to be deleted
    // It is required only in `multiple` mode, but cannot overlap any other property
    filesToDelete: `uploadedFile.filesToDelete`,

    // DB properties: have to be in your schema
    // where the bucket key will be stored
    key: `uploadedFile.key`,
    // where the mime type will be stored
    mimeType: `uploadedFile.mime`,
    // where the bucket name will be stored
    bucket: `uploadedFile.bucket`,
    // where the size will be stored
    size: `uploadedFile.size`,
  },
})

In the example above we nest all the properties under uploadedFile, a mixed property. This convention is a convenient way of storing multiple files in one record.

For the list of all options, take a look at UploadOptions below.

Storing multiple files in one model by invoking uploadFeature more than once

You can pass an array of features to AdminJS, which allows you to define uploads multiple times for one model. In other words, you can have both an avatar and a familyPhoto in your User resource.

In order to make that work, you have to make sure that all the properties passed in each uploadFeature invocation are different, so they don't steal data from each other.

So:

  • make sure to map at least file, filePath and filesToDelete properties to different values in each upload.
  • if you store other fields like mimeType they also should be stored under different paths.
  • define the UploadPathFunction for each upload so that files do not override each other.

Example:


features = [
  uploadFeature({
    provider: {},
    properties: {
      file: `avatarFile`,
      filePath: `avatarFilePath`,
      filesToDelete: `avatarFilesToDelete`,
      key: `avatarKey`,
      mimeType: `avatarMime`,
      bucket: `avatarBucket`,
      size: `avatarSize`,
    },
  }),
  uploadFeature({
    provider: {},
    properties: {
      file: `familyPhoto.file`,
      filePath: `familyPhoto.filePath`,
      filesToDelete: `familyPhoto.filesToDelete`,
      key: `familyPhoto.key`,
      mimeType: `familyPhoto.mime`,
      bucket: `familyPhoto.bucket`,
      size: `familyPhoto.size`,
    },
    uploadPath: (record, filename) => (
      `${record.id()}/family-photos/${filename}`
    ),
  }),
]

In the example above, all the fields are stored under different paths, so during the Frontend <-> Backend data transmission they don't overlap.

Storing multiple files in one model by using multiple option

The feature allows you to store multiple files as an array. To do this you have to:

  • set multiple option to true
  • make sure that all your mapped properties are arrays of strings.

If you use, let's say, the Sequelize adapter, you can set the type of the property to JSONB and, in AdminJS options, define that this property is an array with PropertyOptions.isArray.
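A minimal sketch of multiple mode, assuming a hypothetical Product model with a JSONB images column:

const options = {
  resources: [{
    resource: Product, // hypothetical model with a JSONB `images` column
    features: [uploadFeature({
      provider: { local: { bucket: 'public' } },
      multiple: true, // store an array of files
      properties: {
        // each mapped sub-property will hold an array of strings
        file: 'images.file',
        filePath: 'images.path',
        filesToDelete: 'images.toDelete',
        key: 'images.key',
        mimeType: 'images.mimeType',
      },
    })],
  }],
}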

Validation

The feature can validate both:

  • maximum size of the file
  • available mime types

Take a look at UploadOptions here as well.
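For instance, a sketch limiting uploads to JPEG/PNG images up to 5 MB (the provider settings and property names are placeholders):

uploadFeature({
  provider: { local: { bucket: 'public' } },
  properties: { key: 'fileUrl', mimeType: 'mimeType' },
  validation: {
    mimeTypes: ['image/jpeg', 'image/png'], // allowed mime types
    maxSize: 5 * 1024 * 1024, // maximum size in bytes (5 MB)
  },
})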

Example models and addon configurations

Take a look at this database model working with Google Cloud for reference:

Sequelize database with Google Cloud

Take a look at an example product upload schema:

export const ProductModel = sequelize.define('Products', {
  // Model attributes are defined here
  id: {
    primaryKey: true,
    type: DataTypes.UUID,
    defaultValue: UUIDV4,
  },
  name: {
    allowNull: false,
    type: DataTypes.STRING,
  },
  description: {
    type: DataTypes.TEXT,
    allowNull: true,
  },
  images: {
    type: DataTypes.JSONB,
    allowNull: true,
  },
  topImage: {
    type: DataTypes.JSONB,
    allowNull: true,
  }
}, {
  // Other model options go here
})

It has 2 fields: images and topImage. Let's define that images will have a multi-upload feature and topImage a single-upload feature.

By setting them as the JSONB type, we ensure that the plugin will set up their sub-properties as regular strings (single-upload) or arrays (multi-upload).

To set up uploads for the 2 fields, we have to invoke uploadFeature twice as well:


const validation = {
  mimeTypes: ['image/jpeg', 'image/png'],
}

const features = [
  uploadFeature({
    properties: {
      file: 'images.file',
      filePath: 'images.path',
      filename: 'images.filename',
      filesToDelete: 'images.toDelete',
      key: 'images.key',
      mimeType: 'images.mimeType',
      bucket: 'images.bucket',
    },
    multiple: true,
    provider: {
      gcp: {
        bucket: process.env.PRODUCTS_BUCKET as string,
        expires: 0,
      },
    },
    validation,
  }),
  uploadFeature({
    properties: {
      file: 'topImage.file',
      filePath: 'topImage.path',
      filename: 'topImage.filename',
      filesToDelete: 'topImage.toDelete',
      key: 'topImage.key',
      mimeType: 'topImage.mimeType',
      bucket: 'topImage.bucket',
    },
    provider: {
      gcp: {
        bucket: process.env.TOP_IMAGE_BUCKET as string,
        expires: 0,
      },
    },
    validation,
  }),
]

To see more examples, you can take a look at the example_app inside the repository.


Classes

BaseProvider

Type Definitions


AWSOptions

AWS Credentials which can be set for S3 file upload.

If not given, aws-sdk will try to fetch them from environment variables.

Properties:

  • accessKeyId (string, optional): AWS IAM accessKeyId. By default its value is taken from the AWS_ACCESS_KEY_ID env variable.
  • secretAccessKey (string, optional): AWS IAM secretAccessKey. By default its value is taken from the AWS_SECRET_ACCESS_KEY env variable.
  • region (string): AWS region where your bucket was created.
  • bucket (string): S3 bucket where files will be stored.
  • expires (number, optional): indicates how long links should be available after page load (in minutes). Defaults to 24h. If set to 0, the adapter will mark uploaded files as public (public ACL).



GCPOptions

Google Storage options which can be set for GCP file upload.

In order to set up GCP credentials, follow the Google Cloud authentication tutorial. Basically, it comes down to downloading a service account key and setting the GOOGLE_APPLICATION_CREDENTIALS env variable. After that you are ready to go.

Properties:

  • bucket (string): Google Cloud Storage bucket name, where files will be stored.
  • expires (number, optional): indicates how long links should be available after page load (in minutes). Defaults to 24h. If set to 0, the adapter will mark uploaded files as public.



LocalUploadOptions

Options required by the LocalAdapter
Properties:

  • bucket (string): Path where files will be stored. For example: path.join(__dirname, '../public')



UploadOptions

Configuration options for the @adminjs/upload feature.

Properties:

  • provider (object | BaseProvider): Options for the provider
    • aws (AWSOptions, optional): AWS credentials
    • gcp (GCPOptions, optional): GCP credentials
    • local (LocalUploadOptions, optional): Storage on the local drive
  • properties (object):
    • key (string): Property under which the file key (path) will be stored
    • file (string, optional): Virtual property through which the uploaded file is passed from the frontend to the backend in the request payload. Defaults to file
    • filesToDelete (string, optional): Virtual property used when the upload works in multiple mode. It contains all the keys of the files which should be deleted. Defaults to filesToDelete
    • filePath (string, optional): Virtual property where the path for the uploaded file will be generated and made accessible on the frontend. Defaults to filePath
    • bucket (string, optional): Property under which the file bucket (folder) will be stored
    • mimeType (string, optional): Property under which the file mime type will be stored. When given, the system will show the correct icon for the uploaded file
    • size (string, optional): Property under which the file size will be stored
    • filename (string, optional): Property under which the file name will be stored
  • uploadPath (UploadPathFunction, optional): Function which defines where the file should be placed inside the bucket. Defaults to ${record.id()}/${filename}
  • multiple (boolean, optional): Indicates if the feature should handle uploading multiple files
  • validation (object, optional): Validation rules
    • mimeTypes (Array.<string>, optional): Available mime types
    • maxSize (number, optional): Maximum size in bytes


UploadPathFunction(record, filename)

Function which defines where in the bucket the file should be stored.

If we have 2 uploads in one resource, we might need to set them to

  • ${record.id()}/upload1/${filename}
  • ${record.id()}/upload2/${filename}

By default, the system uploads files to: ${record.id()}/${filename}

Parameters:

  • record (BaseRecord): Record for which the file is uploaded
  • filename (string): filename with extension
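For instance, a sketch of such a function keeping two uploads of one record apart (the upload1 segment is illustrative):

// Hypothetical UploadPathFunction: prefixes files with the record id and an upload name
const uploadPath = (record, filename) => `${record.id()}/upload1/${filename}`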

View Source adminjs-upload/src/features/upload-file/types/upload-options.type.ts, line 43