- Supports ECS task role in the my-import job class
- [fix] The --log-dir, --log-path, and --s3-log options were wrongly not omittable
- [fix] --enable-queue did not work
- [new] New config file config/bricolage.yml to persist command-line options.
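  A hypothetical sketch of such a config/bricolage.yml; the keys are assumed to mirror the command-line options mentioned elsewhere in this changelog, and the exact names and values are placeholders, not confirmed syntax:

  ```yaml
  # config/bricolage.yml -- hypothetical example; key names are assumed
  # to correspond to the --log-dir and --s3-log command-line options.
  log-dir: /var/log/bricolage
  s3-log: s3://example-bucket/bricolage/logs/   # bucket/prefix are placeholders
  ```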
- [new] New option --s3-log to upload log files to S3.
- [fix] Strips ".sql" from the job ID when the job is executed via a *.sql.job file.
- [fix] Strips all file extensions from the jobnet ID, including ".job" and ".sql.job".
- [fix] mys3dump now creates an empty object even if the source table has no records.
- [new] new job class: adhoc. This job class has only one parameter, sql-file, so instance jobs are never affected by default values such as analyze or grant.
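  A minimal sketch of an adhoc job file, assuming the usual Bricolage YAML .job layout with a "class" key; the file names are placeholders:

  ```yaml
  # some_task.job -- hypothetical example; "sql-file" is the single
  # parameter this entry describes for the adhoc job class.
  class: adhoc
  sql-file: some_task.sql
  ```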
- [new] streaming_load: new option --skip-work
- [CHANGE] Drops TD data source support from core. Use separated bricolage-td gem.
- [new] load, insert: Reduces the number of transactions.
- [new] new options -Q and -L
- [new] bricolage-jobnet: new options --enable-queue and --local-state-dir, for auto-named job queue.
- [new] bricolage: new option --log-path.
- [new] bricolage, bricolage-jobnet: new option --log-dir.
- [new] bricolage, bricolage-jobnet: new env BRICOLAGE_LOG_PATH.
- [new] bricolage, bricolage-jobnet: new env BRICOLAGE_LOG_DIR.
- [new] bricolage-jobnet command accepts .job file as a single job jobnet.
- [fix] my-migrate, my-import: Do not expose passwords in command-line arguments or log files.
- [fix] my-migrate, my-import: should not drop old tables in the RENAME transaction, to avoid the "table dropped by concurrent transaction" error.
- [new] mysql data source: new option "collation".
- [fix] AWS S3 API ListObjectsV2 may return corrupted XML; retry the request when that happens
- [new] new job class my-import-delta.
- [new] streaming_load: Reduces the number of transactions.
- [new] streaming_load: new options --ctl-prefix and --keep-ctl (both are optional).
- [new] bricolage, bricolage-jobnet, Bricolage::CommandLineApplication now do not block on executing queries in PostgreSQL-like DBs (including Redshift).
- [CHANGE] Removes (maybe) unused method PostgresConnection#streaming_execute_query. Use #query_batch instead.
- new class SNSDataSource.
- new class NullLogger.
- new exception S3Exception.
- new exception SNSException.
- [new] New parameter "no-backup" for my-import and my-migrate job classes.
- [new] New parameter "sql_log_level" for the psql data source.
- [new] Shows SQL source location before the query.
- Raises ConnectionError for all connection-level problems and SQLError for SQL-level errors.
- [fix] Using CommandLineApplication with the --environment option caused an unexpected option error
- [fix] --dry-run option did not work for my-import job class.
- [new] AWS access key ID & secret key are now optional for S3 data sources (to allow using an IAM role attached to an EC2 instance or ECS task)
- [new] Supports Redshift-attached IAM roles for COPY and UNLOAD.
- code-level change only: [new] new method Transaction#truncate_and_commit
- code-level change only
- [fix] require 'bricolage/context' wrongly caused NameError.
- [new] PostgresConnection#drop_table_force utilizes DROP TABLE IF EXISTS.
- rebuild-rename, rebuild-drop, my-import, my-migrate, create, createview: Reduces the number of transactions for speed.
- [fix] my-import: mys3dump: Fixes buffer size problem.
- [fix] my-import: mys3dump: Escapes more meta characters (e.g. \n, \r, \v, ...).
- [fix] Adds dependency to rake
- [fix] my-import: Reduces warning log messages.
- [fix] streaming_load: Disables statupdate for log staging table, it is useless.
- [fix] streaming_load: Disables compupdate on COPY; it might cause an Assert error on some clusters.
- [fix] Fixes a syntax error on Ruby 2.1
- [CHANGE][EXPERIMENTAL] streaming_load: Always reuses the same work log table xxxx_l_wk instead of temporary xxxx_l_tmpNNNN tables. Frequent drop-create might slow down Redshift DDL, so this reduces the number of drop-create operations.
- [fix] redis-export: removes an unneeded error check.
- [new] redis-export: made faster by using a cursor and Redis pipelining.
- [new][EXPERIMENTAL] new job class redis-export.
- [new] streaming_load: Fast log check by temporary load log table.
- [new] streaming_load: Ignores all S3 key-does-not-exist errors; they are caused by S3 eventual consistency.
- [fix] load, streaming_load: "encrypted" load option should not be used for SSE-KMS
- [new] streaming_load: Supports S3 server-side encryption with AWS KMS (Key Management Service).
- Bricolage now requires Ruby AWS-SDK v2 for AWS signature v4.
- [fix] Ruby 2.1 does not have Etc.uname; use the uname command instead.
- [new] Supports loading from encrypted S3 data source.
- [new] New job class "createview".
- [new] The "create" and "sql" job classes now support the "grant" parameter.
- [new] my-migrate job class supports sql-file parameter for export.
- [new] td-export job class supports .sql.job files
- [fix] Ensures the VACUUM lock is released even when the VACUUM statement fails.
- [new] Introduces per-subsystem variable files (SUBSYS/variable.yml)
- [new] Allows providing default options via the "defaults" global variable (e.g. enabling the "grant" option by default)
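  A hypothetical sketch of enabling "grant" by default through the "defaults" global variable in a variable file; the nested key names and values below are assumptions for illustration, not confirmed syntax:

  ```yaml
  # variable.yml -- hypothetical example of the "defaults" global variable.
  defaults:
    grant:
      privilege: select          # placeholder privilege
      to: "group analyst_group"  # placeholder grantee
  ```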
- [fix] Supports jobnets that contain both jobs and sub-jobnets
- streaming_load: new option --sql-file