Large file uploads (>4gb) not working in Safari; high memory usage for uploads in Safari #125
Comments
Hey, thanks for the issue! Not sure, but it sounds like there could be a bug in this library. Could you share your upload code?
Sure, here's the code relating to the upload. Let me know if there is anything else that could help.

```tsx
// components/Upload.tsx
import { useS3Upload } from 'next-s3-upload';

const { files, resetFiles, uploadToS3 } = useS3Upload({
  endpoint: '/api/appUpload',
});

const onUpload = async () => {
  try {
    await uploadToS3(selectedFile, {
      endpoint: {
        request: {
          body: {
            userId: user.id,
            appId,
            uploadId,
          },
          headers: {},
        },
      },
    });
  } catch (error) {
    // handle error
    // This is where error.message is "[Error] NotReadableError:
    // The I/O read operation failed." on large files in Safari
  }
};
```

```ts
// pages/api/appUpload.ts
import { NextApiRequest } from 'next';
import { getSession } from 'next-auth/react';
import { APIRoute } from 'next-s3-upload';

export const getS3AppBuildPath = async (req: NextApiRequest) => {
  const { uploadId, userId, appId } = req.body;
  if (!userId || !appId || !uploadId) {
    throw new Error('Bad request');
  }
  const session = await getSession({ req });
  if (!session) {
    throw new Error('Not authenticated');
  }
  return `${appId}/${uploadId}/bundle.zip`;
};

export default APIRoute.configure({
  accessKeyId: process.env.S3_UPLOAD_KEY,
  secretAccessKey: process.env.S3_UPLOAD_SECRET,
  bucket: process.env.S3_APP_UPLOAD_BUCKET,
  region: process.env.S3_UPLOAD_REGION,
  async key(req: NextApiRequest) {
    return await getS3AppBuildPath(req);
  },
});
```
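As an aside on the memory usage: `@aws-sdk/lib-storage`'s `Upload` class (which next-s3-upload uses under the hood) accepts `partSize` and `queueSize` options that bound how much of the file is buffered at once. A minimal sketch, assuming S3's documented multipart constraints (5 MiB minimum part size, 10,000-part cap); the `choosePartSize` helper is hypothetical, not part of this library:

```typescript
// S3 multipart constraints: parts must be >= 5 MiB (except the last),
// and a single upload may have at most 10,000 parts.
const MIN_PART_SIZE = 5 * 1024 * 1024;
const MAX_PARTS = 10_000;

// Hypothetical helper: pick the smallest valid part size for a file,
// so that only a few small parts are ever buffered in memory at once.
function choosePartSize(fileSize: number): number {
  return Math.max(MIN_PART_SIZE, Math.ceil(fileSize / MAX_PARTS));
}

// With lib-storage directly, this would be passed as something like:
//   new Upload({ client, params, partSize: choosePartSize(file.size), queueSize: 2 })
// so memory use is roughly partSize * queueSize rather than the whole file.
```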
Hmm, ok, your code looks spot on. I'll try to test it out with Safari and see if I can get you an answer. Sorry you're running into this issue.
Looks like this is a bug in lib-storage. We use lib-storage under the hood to do the upload.
In the first thread someone had a solution using patch-package. Pretty ugly :( I'll try to reproduce and post something in those threads.
Also experiencing this, so bumping.
Could we maybe get away with using a multipart upload if the file size is over some threshold N and the browser is Safari?
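One way that idea could look (a sketch under assumptions, not this library's API): slice the `File` into parts yourself and upload each part separately, e.g. to presigned `UploadPart` URLs, so Safari never has to read the whole file in one I/O operation. `Blob.slice` is lazy, so only the part being uploaded is read into memory. The names and the presigned-URL plumbing below are hypothetical:

```typescript
// Hypothetical sketch: split a file of `size` bytes into independent
// byte ranges, each small enough to read and upload on its own.
interface PartRange {
  partNumber: number; // S3 part numbers start at 1
  start: number;      // inclusive byte offset
  end: number;        // exclusive byte offset, as Blob.slice expects
}

function partRanges(size: number, partSize: number): PartRange[] {
  const parts: PartRange[] = [];
  for (let start = 0, n = 1; start < size; start += partSize, n++) {
    parts.push({ partNumber: n, start, end: Math.min(start + partSize, size) });
  }
  return parts;
}

// In the browser this would drive something like (presigned URLs assumed):
//   for (const { start, end } of partRanges(file.size, 64 * 1024 * 1024)) {
//     await fetch(presignedUrlForThisPart, { method: 'PUT', body: file.slice(start, end) });
//   }
// followed by a CompleteMultipartUpload call server-side.
```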
My team really appreciates this library, and overall it's working out great for us. The one issue we are encountering is uploading large zip files (> 4 GB) with Safari on macOS. All other major browsers seem to work just fine; Safari gives this error as soon as you start the upload of a large file with the `uploadToS3` function:

> [Error] NotReadableError: The I/O read operation failed.

Uploads from Safari of smaller files (< 4 GB) also seem to use a lot of system memory during the upload compared to Chrome. On Safari, the system memory usage spikes, the fans go wild (pre-M1 MBP model), and I get this warning message in the browser: "This webpage is using significant memory. Closing it may improve the responsiveness of your Mac."

Any ideas on what the issue could be here? Should files larger than 4 GB work in Safari? Let me know if there is any other information I can provide to help diagnose the issue if it's not reproducible in other projects.

Thank you!