
Commit

add the file_actor_decorator with the s3 upload code in it... 1GB threshold
cziaarm committed Jul 2, 2024
1 parent cd05333 commit f147023
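
In effect, ingest is routed on file size: files below Rails' `1.gigabytes` threshold follow the usual Fedora path, while anything at or above it is pushed straight to S3 and attached as an external file. A minimal sketch of just the threshold check, with the byte value spelled out (`io_size` is a hypothetical value, not part of this change):

    # ActiveSupport: 1.gigabytes == 1024**3 bytes
    S3_THRESHOLD = 1.gigabytes       # => 1_073_741_824

    io_size = 1_500_000_000          # hypothetical upload size, in bytes
    io_size >= S3_THRESHOLD          # => true, so this file goes straight to S3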
Showing 1 changed file with 37 additions and 0 deletions.
37 changes: 37 additions & 0 deletions app/actors/hyrax/actors/file_actor_decorator.rb
@@ -0,0 +1,37 @@
module Hyrax
  module Actors
    # Actions for a file identified by file_set and relation (maps to use predicate)
    # @note Spawns asynchronous jobs
    module FileActorDecorator

      def perform_ingest_file_through_active_fedora(io)
        # Skip versioning because versions will be minted by VersionCommitter
        # as necessary during save_characterize_and_record_committer.
        Rails.logger.error("[FileActor] starting write for #{file_set.id}")
        if io.size.to_i >= 1.gigabytes
          # Files too big to stream through the normal ingest path: upload them
          # directly to S3, keyed by SHA1 digest, and attach as an external file.
          Rails.logger.error("[FileActor] Uploading directly to S3 for file_set #{file_set.id}")
          digest = `sha1sum #{io.path}`.split.first
          file_set.s3_only = digest
          s3_object = Aws::S3::Object.new(ENV['AWS_BUCKET'], digest)
          s3_object.upload_file(io.path) unless s3_object.exists?
          Hydra::Works::AddExternalFileToFileSet.call(file_set, s3_object.public_url, relation)
          # TODO: how do we make sure the sha gets indexed?
        else
          Rails.logger.error("[FileActor] writing to fcrepo #{file_set.id}")
          Hydra::Works::AddFileToFileSet.call(file_set,
                                              io,
                                              relation,
                                              versioning: false)
        end
        return false unless file_set.save

        repository_file = related_file
        create_version(repository_file, user)
        CharacterizeJob.perform_later(file_set, repository_file.id, pathhint(io))
      end
    end
  end
end

Hyrax::Actors::FileActor.prepend(Hyrax::Actors::FileActorDecorator)
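
Keying the S3 object by the file's SHA1 digest means identical files map to the same object, which is why the upload is skipped when the object already exists. A rough way to confirm a large file landed in S3 under that scheme (assumes the same ENV['AWS_BUCKET'] and the aws-sdk-s3 gem; the local path is only an example):

    require 'aws-sdk-s3'

    path   = '/tmp/big_video.mp4'           # hypothetical local copy of the file
    digest = `sha1sum #{path}`.split.first  # same key scheme as the decorator

    object = Aws::S3::Object.new(ENV['AWS_BUCKET'], digest)
    puts object.exists? ? "uploaded: #{object.public_url}" : "missing from S3"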
