Issues with `fileAdapter` when integrating Amazon S3
I decided to integrate Amazon S3 into my project, which is built on the open-condo codebase. It already includes scripts for S3-compatible storage, so in theory they should work with Amazon S3 as well. However, I noticed that the S3 logic is named `SberCloudFileAdapter`. Wouldn't it be better to rename it to something more generic, like `S3CloudFileAdapter`?
I created an Amazon S3 bucket, granted the necessary permissions, and set the following environment variable in my `.env` file:
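Roughly the following shape, with placeholders instead of real credentials (a sketch assuming the adapter reads a JSON config like `SBERCLOUD_OBS_CONFIG`; adjust the variable name and keys to whatever `fileAdapter.js` actually expects):

```env
# Sketch only: variable name and key layout assumed, not verbatim from my .env
SBERCLOUD_OBS_CONFIG={"bucket":"<BUCKET_NAME>","s3Options":{"server":"https://s3.eu-central-1.amazonaws.com","access_key_id":"<YOUR_ACCESS_KEY_ID>","secret_access_key":"<YOUR_SECRET_ACCESS_KEY>"}}
```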
After launching the project, I encountered an error saying the authorization mechanism is not supported and that AWS4-HMAC-SHA256 must be used. According to [this Stack Overflow post](https://stackoverflow.com/questions/26533245/the-authorization-mechanism-you-have-provided-is-not-supported-please-use-aws4), you need to specify a signature version, but it seems the `esdk-obs-nodejs` library doesn't support that option. I managed to work around the issue by following the [official example](https://support.huaweicloud.com/intl/en-us/sdk-nodejs-devg-obs/obs_29_0109.html), which explains that you can provide a `security_token`. To make it work, I renamed `secret_access_key` to `security_token` and commented out the required parameter in `packages/keystone/fileAdapter/fileAdapter.js`:
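Conceptually, the change looked like this (a sketch, not the exact upstream lines; `conf` here stands for the parsed JSON config from the environment variable above):

```js
// Sketch of the workaround in packages/keystone/fileAdapter/fileAdapter.js
// (illustrative only -- the real code differs):
const s3Options = {
    server: conf.s3Options.server,
    access_key_id: conf.s3Options.access_key_id,
    // was: secret_access_key: conf.s3Options.secret_access_key,
    security_token: conf.s3Options.secret_access_key, // renamed per the OBS example
}
// ...and the config validation that listed `secret_access_key` as a
// required parameter is commented out so the adapter initializes.
```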
This allowed me to upload empty text files, but any file containing data triggered the following error:

```
A header you provided implies functionality that is not implemented
```
A similar issue is described in [this Stack Overflow post]. It seems related to how the stream is handled in `packages/keystone/fileAdapter/sberCloudFileAdapter.js`. I attempted to fix it by modifying the code as follows:
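This is a sketch of the idea rather than the exact diff: buffer the incoming stream so the request carries a Content-Length header instead of chunked transfer encoding, which appears to be what that error complains about (parameter names are assumptions):

```js
// Inside the adapter's save() (sketch): collect the upload stream into a
// Buffer so the request sends a Content-Length header rather than
// `Transfer-Encoding: chunked`, which S3 rejects with the error above.
const chunks = []
for await (const chunk of stream) chunks.push(chunk)
const body = Buffer.concat(chunks)

// then hand `body` to putObject in place of the raw stream, e.g.:
// { Body: body, ContentLength: body.length, Bucket, Key, ContentType: mimetype }
```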
After that, a new error appeared when uploading documents via the UI:
"extensions": {
"code": "INTERNAL_SERVER_ERROR",
"messageForDeveloper": "The \"path\" argument must be of type string or an instance of Buffer or URL. Received undefined\n\nGraphQL request:2:3\n1 | mutation createDocuments($data: [DocumentsCreateInput]) {\n2 | objs: createDocuments(data: $data) {\n | ^\n3 | organization {"
}
However, using the following script, I can successfully upload files:
```js
require('dotenv').config()
const fs = require('fs')
const path = require('path')
const { v4: uuidv4 } = require('uuid')
const FileAdapter = require('./fileAdapter')

const Adapter = new FileAdapter('test', false)

const saveTestImage = async () => {
    try {
        const filePath = path.join(__dirname, 'sample.jpg')
        if (!fs.existsSync(filePath)) {
            throw new Error(`No file: ${filePath}`)
        }
        const stream = fs.createReadStream(filePath)
        const id = uuidv4()
        const filename = path.basename(filePath)
        const mimetype = 'image/jpeg'
        const encoding = undefined
        const meta = { description: 'Test' }
        console.log(' filename, id, mimetype, encoding, meta ', filename, id, mimetype, encoding, meta)
        const result = await Adapter.save({ stream, filename, id, mimetype, encoding, meta })
        console.log('Done', result)
    } catch (error) {
        console.error('Error', error)
    }
}

saveTestImage()
```
I also tried converting the stream to a buffer, but in that case the uploaded file has zero size.
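For reference, the conversion looked roughly like this (a sketch of the approach, not the exact code):

```js
// Collect a readable stream into a single Buffer before passing it to save().
// The upload then succeeds, but the stored object ends up with 0 bytes --
// possibly because the stream has already been consumed by that point.
const streamToBuffer = (stream) => new Promise((resolve, reject) => {
    const chunks = []
    stream.on('data', (chunk) => chunks.push(chunk))
    stream.on('error', reject)
    stream.on('end', () => resolve(Buffer.concat(chunks)))
})
```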
Is there a recommended approach or any additional steps I should take to properly configure AWS S3 integration?
Alternative Test Script with AWS SDK
For testing, I tried another approach using the official AWS SDK:
```js
// Import the AWS SDK library.
const AWS = require('aws-sdk')

// Configure AWS SDK with credentials and region.
AWS.config.update({
    accessKeyId: '<YOUR_ACCESS_KEY_ID>',
    secretAccessKey: '<YOUR_SECRET_ACCESS_KEY>',
    region: 'eu-central-1', // Replace with your S3 bucket region.
})

// Create an S3 client instance.
const s3 = new AWS.S3()

async function putObject() {
    try {
        const params = {
            Bucket: 'bucket name', // Specify the bucket name.
            Key: 'example/objectname', // Specify the object key.
            Body: 'Hello S3', // Specify the object content.
        }
        // Upload the object.
        const result = await s3.putObject(params).promise()
        console.log('Put object(%s) under the bucket(%s) successful!!', params.Key, params.Bucket)
        console.log('ETag: %s', result.ETag)
    } catch (error) {
        console.error('An error occurred while putting the object to S3:', error)
    }
}

putObject()
```
In addition, I set the following permissions in AWS:
Cross-origin resource sharing (CORS):
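A permissive test configuration along these lines (illustrative, not the exact JSON I applied):

```json
[
    {
        "AllowedHeaders": ["*"],
        "AllowedMethods": ["GET", "PUT", "POST", "DELETE", "HEAD"],
        "AllowedOrigins": ["*"],
        "ExposeHeaders": ["ETag"]
    }
]
```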
Bucket policy:
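And a policy granting the upload user object-level access, roughly (the placeholders are hypothetical):

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": { "AWS": "arn:aws:iam::<ACCOUNT_ID>:user/<UPLOAD_USER>" },
            "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject"],
            "Resource": "arn:aws:s3:::<BUCKET_NAME>/*"
        }
    ]
}
```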
Are there any additional steps or configurations required to properly connect to and use Amazon S3?