- MODDATAIMP-1109 Issue with mod-data-import module when configuration for AWS is set
- MODDATAIMP-1121 Remove token header for Eureka env
- MODDATAIMP-1054 Add error handling for script interrupts
- MODDATAIMP-1046 Fix "File upload already in progress" error appearing when a data import job is run for more than one file
- MODDATAIMP-1082 Improve error logging for AWS-related issues
- MODDATAIMP-1085 Provide module permissions for subject types and sources
- MODDATAIMP-1088 Upgrade Spring from 5.3.23 to 6.1.13
- MODDATAIMP-1087 Upgrade to log4j-slf4j2-impl and Log4j 2.24.0
- MODDATAIMP-1099 mod-data-import Ramsons 2024 R2 - RMB v35.3.x update
- MODDATAIMP-1083 Fix inconsistencies in permission naming
- MODDATAIMP-1020 Update load-marc-data-into-folio script with Poppy
- MODDATAIMP-1015 Upgrade data-import to RMB v35.2.0 and Vertx 4.5.4
- MODDATAIMP-1007 Make /data-import/uploadDefinitions/{id}/processFiles?defaultMapping=false asynchronous
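
  A minimal sketch of invoking this endpoint with the Java 11 `HttpClient`. The host, tenant, token and ids are placeholders, and the request body shape (upload definition plus job profile reference) is an assumption rather than the exact schema.

  ```java
  import java.net.URI;
  import java.net.http.HttpClient;
  import java.net.http.HttpRequest;
  import java.net.http.HttpResponse;

  // Sketch only: host, tenant, token and ids are placeholders, and the body
  // shape (upload definition plus job profile reference) is an assumption.
  public class ProcessFilesExample {
    public static void main(String[] args) throws Exception {
      String uploadDefinitionId = "00000000-0000-0000-0000-000000000000";
      String body = "{\"uploadDefinition\": {\"id\": \"" + uploadDefinitionId + "\"},"
          + " \"jobProfileInfo\": {\"id\": \"00000000-0000-0000-0000-000000000001\"}}";

      HttpRequest request = HttpRequest.newBuilder()
          .uri(URI.create("https://okapi.example.org/data-import/uploadDefinitions/"
              + uploadDefinitionId + "/processFiles?defaultMapping=false"))
          .header("Content-Type", "application/json")
          .header("X-Okapi-Tenant", "diku")      // placeholder tenant
          .header("X-Okapi-Token", "<token>")    // placeholder token
          .POST(HttpRequest.BodyPublishers.ofString(body))
          .build();

      // Since MODDATAIMP-1007 the call returns quickly and the files are
      // processed asynchronously; progress is tracked via job executions.
      HttpResponse<String> response = HttpClient.newHttpClient()
          .send(request, HttpResponse.BodyHandlers.ofString());
      System.out.println("Status: " + response.statusCode());
    }
  }
  ```
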
- MODDATAIMP-1003 Provide missing permissions to create Orders
- MODDATAIMP-886 Create Kafka topics instead of relying on auto create
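
  A sketch of the approach using the standard Kafka `AdminClient` to create a topic up front instead of relying on broker auto-creation. The topic name, partition count and replication factor are placeholders, not the module's actual topic configuration.

  ```java
  import java.util.List;
  import java.util.Properties;
  import org.apache.kafka.clients.admin.AdminClient;
  import org.apache.kafka.clients.admin.AdminClientConfig;
  import org.apache.kafka.clients.admin.NewTopic;

  // Sketch only: the topic name, partition count and replication factor are
  // placeholders, not the module's actual topic configuration.
  public class CreateTopicExample {
    public static void main(String[] args) throws Exception {
      Properties props = new Properties();
      props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka:9092");

      try (AdminClient admin = AdminClient.create(props)) {
        // Create the topic explicitly instead of relying on broker auto-creation
        NewTopic topic = new NewTopic("folio.Default.diku.DI_SOME_EVENT", 1, (short) 1);
        admin.createTopics(List.of(topic)).all().get(); // blocks until the broker confirms
      }
    }
  }
  ```
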
- MODDATAIMP-969 Update folio-s3-client to v2.0.5
- MODDATAIMP-871 Upgrade folio-kafka-wrapper to 3.0.0 version
- MODDATAIMP-854 Upgrade mod-data-import to Java 17
- UXPROD-4337 Add S3 file upload/download support, file splitting for MARC 21, and smarter job prioritization
- Bumped `data-import` interface to 3.1
- MODDATAIMP-898 Add system user to allow asynchronous processing
- MODDATAIMP-786 Update data-import-util library to v1.11.0
- MODDATAIMP-727 Upgrade Vertx to v4.3.4. Fix "Producer closed while send in progress"
- MODDATAIMP-724 Logging improvement - Configuration
- MODDATAIMP-730 Upgrade dependency kafkaclients v3.2.3, folio-di-support v1.7.0, Spring v5.3
- FAT-3397 Put actual chunk id to kafka headers to correlate log messages for particular chunk
- MODDATAIMP-736 Adjust logging configuration for MOD-DI to display datetime in a proper format
- MODDATAIMP-641 Logging improvement
- MODDATAIMP-757 Add missed permissions for invoice data import flow
- MODDATAIMP-758 Improve logging (hide SQL requests)
- MODDATAIMP-768 Update permissions for links update
- MODDATAIMP-750 Update util dependencies
- MODDICORE-306 Upgrade data-import-processing-core dependency to v4.0.1
- MODDATAIMP-721 Upgrade RMB to v35.0.1
- MODDATAIMP-709 Supports users interface 15.0, 16.0
- MODDATAIMP-714 Assign each authority record to an Authority Source file list
- MODDATAIMP-472 EDIFACT files with txt file extensions do not import
- MODDATAIMP-646 Logs show incorrectly formatted request id
- MODDATAIMP-696 Upgrade RMB to v34.1.0
- Update RMB to v33.2.6
- MODDATAIMP-468 Update source-manager-job-executions interface 2.3 to 3.0
- MODDATAIMP-494 Improve logging
- MODDATAIMP-598 Log4j (CVE-2021-44228) vulnerability correction
- MODSOURMAN-550 Reduce BE response payload for DI Landing Page to increase performance
- MODDATAIMP-480 Suppress harmless errors from Data Import logs
- MODDATAIMP-548 Provide system properties to set records chunk size for each record type and marc format
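
  A sketch of the mechanism, assuming the properties are read as JVM system properties; the property name and default value below are illustrative, not the module's documented keys.

  ```java
  // Illustrative only: the property name and default are assumptions, not the
  // module's documented configuration keys.
  public class ChunkSizeConfig {
    public static void main(String[] args) {
      // Set on the JVM command line, e.g. -Dfile.processing.marc.raw.buffer.chunk.size=50
      int marcRawChunkSize = Integer.getInteger("file.processing.marc.raw.buffer.chunk.size", 50);
      System.out.println("MARC raw records per chunk: " + marcRawChunkSize);
    }
  }
  ```
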
- MODDATAIMP-511 Upgrade to RAML Module Builder 33.x
- MODDATAIMP-491 Improve logging to be able to trace the path of each record and file_chunks
- MODDATAIMP-465 Fix memory leaks - close Vertx Kafka producers
- MODDATAIMP-459 EDIFACT files with CAPS file extensions do not import
- MODDATAIMP-514 Add support for max.request.size configuration for Kafka messages
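
  A sketch of the underlying Kafka producer setting this change exposes; the broker address and the 4 MB limit are placeholders, and in the module the value comes from configuration rather than code.

  ```java
  import java.util.Properties;
  import org.apache.kafka.clients.producer.KafkaProducer;
  import org.apache.kafka.clients.producer.ProducerConfig;
  import org.apache.kafka.common.serialization.StringSerializer;

  // Sketch only: the broker address and 4 MB limit are placeholders; in the
  // module this value comes from configuration rather than code.
  public class MaxRequestSizeExample {
    public static void main(String[] args) {
      Properties props = new Properties();
      props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka:9092");
      props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
      props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
      // max.request.size caps the size of a single produce request (Kafka default is ~1 MB)
      props.put(ProducerConfig.MAX_REQUEST_SIZE_CONFIG, Integer.toString(4 * 1024 * 1024));

      try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
        // large record-chunk messages would be sent here
      }
    }
  }
  ```
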
- MODDATAIMP-464 Change dataType to have a common type for MARC-related subtypes
- Update data-import-processing-core dependency to v3.1.2
- MODDATAIMP-433 Store MARC Authority Records
- MODDATAIMP-451 Update interface version
- MODDATAIMP-413 When a file is uploaded for data import, the file extension check should be case-insensitive
- MODDATAIMP-388 Import job is not completed on file parsing error
- MODDATAIMP-400 Resolve data-import catching error issues
- MODDATAIMP-315 Use Kafka for data-import file processing
- MODDATAIMP-352 Implement source reader for EDIFACT files.
- MODDATAIMP-358 Upgrade to RAML Module Builder 32.x.
- MODDATAIMP-348 Add personal data disclosure form.
- MODDATAIMP-316 Disable CQL2PgJSON & CQLWrapper extra logging in mod-data-import
- MODDATAIMP-342 Upgrade to RMB v31.1.5
- Add batch-MARC-import script, `scripts/load-marc-data-into-folio.sh`
- MODDATAIMP-324 Update all Data-Import modules to the new RMB version
- MODDATAIMP-338 Data-import job prevents all users from uploading a file and initiating another data-import job
- MODDATAIMP-325 Remove delimited reference values from file extensions data settings
- MODDATAIMP-300 Updated marc4j version to 2.9.1
- Updated reference to raml-storage
- MODDATAIMP-301 Upgrade to RMB 30.0.2
- MODDATAIMP-304 Lack of details in case of an error during file processing
- MODDATAIMP-296 Added migration script to support RMB version update
- Updated RMB version to 29.1.5
- Added defaultMapping query param
- Applied new JVM features to manage container memory
- Fixed security vulnerabilities
- Fixed encoding issues
- Added order of the record in importing file
- Added blocking coordination to process files in a sequential manner
- Added total records counter for ChunkProcessing
- Updated schemas to support new RawRecords
- Filled in "fromModuleVersion" value for each "tables" section in schema.json
- Updated README with information about test mode of the module.
- Updated documentation on file processing api.
- Added support for incoming xml files containing MARC records.
- Added contentType field to the RawRecordsDto that describes the type of records (MARC, EDIFACT, etc.) and the format of record representation (JSON, XML, RAW).
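
  A hypothetical sketch of the content types this field distinguishes; the constant names below are an assumption and may differ from the generated `RawRecordsDto` enum.

  ```java
  // Hypothetical constant names; the real values live in the generated RawRecordsDto.
  public enum RecordsContentType {
    MARC_RAW,    // raw binary MARC (ISO 2709)
    MARC_JSON,   // MARC records serialized as JSON
    MARC_XML,    // MARCXML
    EDIFACT_RAW  // raw EDIFACT messages
  }
  ```
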
- Fixed JobExecution status update error
- Added Spring DI support
- Added dependency on users interface
- Added support for incoming json files containing MARC records
- Fixed creating FileProcessor instance used in ProxyGen service
- Fixed updating JobProfile for jobs during the file processing
- Created service for file chunking
- Implemented MARC file reader for local files
- Added CRUD for FileExtension entity
- Changed logging configuration to slf4j
- Optimized file upload functionality
- Used shared data-import-utils
- Renamed endpoints

  | METHOD | URL | DESCRIPTION |
  |--------|-----|-------------|
  | POST | /data-import/uploadDefinitions | Create Upload Definition |
  | GET | /data-import/uploadDefinitions | Get list of Upload Definitions |
  | GET | /data-import/uploadDefinitions/{uploadDefinitionId} | Get Upload Definition by id |
  | PUT | /data-import/uploadDefinitions/{uploadDefinitionId} | Update Upload Definition |
  | DELETE | /data-import/uploadDefinitions/{uploadDefinitionId} | Delete Upload Definition |
  | POST | /data-import/uploadDefinitions/{uploadDefinitionId}/files/{fileId} | Upload file |
  | POST | /data-import/uploadDefinitions/{uploadDefinitionId}/files | Add file to Upload Definition |
  | DELETE | /data-import/uploadDefinitions/{uploadDefinitionId}/files/{fileId} | Delete file |
  | POST | /data-import/uploadDefinitions/{uploadDefinitionId}/processFiles | Start file processing |
  | GET | /data-import/fileExtensions | Get list of File Extensions |
  | POST | /data-import/fileExtensions | Create File Extension |
  | GET | /data-import/fileExtensions/{id} | Get File Extension by id |
  | PUT | /data-import/fileExtensions/{id} | Update File Extension |
  | DELETE | /data-import/fileExtensions/{id} | Delete File Extension |
  | POST | /data-import/fileExtensions/restore/default | Restore default File Extensions |
  | GET | /data-import/dataTypes | Get list of DataTypes |
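
  A minimal sketch of the first call in the table, creating an Upload Definition with the Java 11 `HttpClient`. The host, tenant, token and file name are placeholders, and the body shape (a list of file definitions) is an assumption based on the endpoint's purpose.

  ```java
  import java.net.URI;
  import java.net.http.HttpClient;
  import java.net.http.HttpRequest;
  import java.net.http.HttpResponse;

  // Sketch only: host, tenant, token and file name are placeholders, and the
  // body shape (a list of file definitions) is an assumption.
  public class CreateUploadDefinitionExample {
    public static void main(String[] args) throws Exception {
      String body = "{\"fileDefinitions\": [{\"name\": \"records.mrc\"}]}";

      HttpRequest request = HttpRequest.newBuilder()
          .uri(URI.create("https://okapi.example.org/data-import/uploadDefinitions"))
          .header("Content-Type", "application/json")
          .header("X-Okapi-Tenant", "diku")      // placeholder tenant
          .header("X-Okapi-Token", "<token>")    // placeholder token
          .POST(HttpRequest.BodyPublishers.ofString(body))
          .build();

      HttpResponse<String> response = HttpClient.newHttpClient()
          .send(request, HttpResponse.BodyHandlers.ofString());
      // The response returns the created definition with the ids used by the
      // subsequent file upload and processFiles calls in the table above.
      System.out.println(response.statusCode() + "\n" + response.body());
    }
  }
  ```
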
- Implemented functionality of file storing
- Implemented functionality of file upload