[MSEARCH-786] rename bibframe to linked data for consistency
Aleksei Pronichev committed Jul 3, 2024
1 parent fe3b144 commit 66a06af
Showing 46 changed files with 358 additions and 343 deletions.
4 changes: 2 additions & 2 deletions NEWS.md
@@ -18,8 +18,8 @@
* Return Unified List of Inventory Locations in a Consortium ([MSEARCH-681](https://folio-org.atlassian.net/browse/MSEARCH-681))
* Remove ability to match on LCCN searches without a prefix ([MSEARCH-752](https://folio-org.atlassian.net/browse/MSEARCH-752))
* Search consolidated items/holdings data in consortium ([MSEARCH-759](https://folio-org.atlassian.net/browse/MSEARCH-759))
- * Create bibframe index and process bibframe events ([MSEARCH-781](https://folio-org.atlassian.net/browse/MSEARCH-781))
- * Create bibframe authority index and process bibframe authority events ([MSEARCH-784](https://folio-org.atlassian.net/browse/MSEARCH-784))
+ * Create linked data work index and process linked data work events ([MSEARCH-781](https://folio-org.atlassian.net/browse/MSEARCH-781))
+ * Create linked data authority index and process linked data authority events ([MSEARCH-784](https://folio-org.atlassian.net/browse/MSEARCH-784))
* Allow Unified List of Inventory Locations in a Consortium to be fetched by member tenants ([MSEARCH-660](https://folio-org.atlassian.net/browse/MSEARCH-660))
* Implement Indexing of Campuses from Kafka ([MSEARCH-770](https://issues.folio.org/browse/MSEARCH-770))
* Extend response with additional Location fields for Inventory Locations in a Consortium endpoint ([MSEARCH-775](https://folio-org.atlassian.net/browse/MSEARCH-775))
20 changes: 10 additions & 10 deletions README.md
@@ -244,7 +244,7 @@ and [Cross-cluster replication](https://docs.aws.amazon.com/opensearch-service/l
| KAFKA_CONTRIBUTORS_TOPIC_REPLICATION_FACTOR | - | Replication factor for `search.instance-contributor` topic. |
| KAFKA_CONSORTIUM_INSTANCE_CONCURRENCY | 2 | Custom number of kafka concurrent threads for consortium.instance message consuming. |
| KAFKA_LOCATION_CONCURRENCY | 1 | Custom number of kafka concurrent threads for inventory.location, inventory.campus, inventory.institution and inventory.library message consuming. |
- | KAFKA_BIBFRAME_CONCURRENCY | 1 | Custom number of kafka concurrent threads for bibframe message consuming. |
+ | KAFKA_LINKED_DATA_CONCURRENCY | 1 | Custom number of kafka concurrent threads for linked data message consuming. |
| KAFKA_CONSORTIUM_INSTANCE_TOPIC_PARTITIONS | 50 | Amount of partitions for `search.consortium.instance` topic. |
| KAFKA_CONSORTIUM_INSTANCE_TOPIC_REPLICATION_FACTOR | - | Replication factor for `search.consortium.instance` topic. |
| KAFKA_SUBJECTS_CONCURRENCY | 2 | Custom number of kafka concurrent threads for subject message consuming. |
@@ -415,15 +415,15 @@ Consortium feature on module enable is defined by 'centralTenantId' tenant param

### Search API

- | METHOD | URL                                        | DESCRIPTION                                                                           |
- |:-------|:------------------------------------------|:-------------------------------------------------------------------------------------|
- | GET    | `/search/instances`                        | Search by instances and to this instance items and holding-records                    |
- | GET    | `/search/authorities`                      | Search by authority records                                                           |
- | GET    | `/search/bibframe`                         | Search linked data graph resource descriptions                                        |
- | GET    | `/search/bibframe/authorities`             | Search linked data graph authority resource descriptions                              |
- | GET    | `/search/{recordType}/facets`              | Get facets where recordType could be: instances, authorities, contributors, subjects  |
- | GET    | ~~`/search/instances/ids`~~                | (DEPRECATED) Stream instance ids as JSON or plain text                                |
- | GET    | ~~`/search/holdings/ids`~~                 | (DEPRECATED) Stream holding record ids as JSON or plain text                          |
+ | METHOD | URL                               | DESCRIPTION                                                                           |
+ |:-------|:----------------------------------|:--------------------------------------------------------------------------------------|
+ | GET    | `/search/instances`               | Search by instances and to this instance items and holding-records                    |
+ | GET    | `/search/authorities`             | Search by authority records                                                           |
+ | GET    | `/search/linked-data/works`       | Search linked data graph work resource descriptions                                   |
+ | GET    | `/search/linked-data/authorities` | Search linked data graph authority resource descriptions                              |
+ | GET    | `/search/{recordType}/facets`     | Get facets where recordType could be: instances, authorities, contributors, subjects  |
+ | GET    | ~~`/search/instances/ids`~~       | (DEPRECATED) Stream instance ids as JSON or plain text                                |
+ | GET    | ~~`/search/holdings/ids`~~        | (DEPRECATED) Stream holding record ids as JSON or plain text                          |
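The two linked-data endpoints in the table above take a CQL `query` plus `limit`/`offset` paging parameters, like the other search endpoints. A minimal sketch of building such a request URI (the base URL is hypothetical; a real FOLIO deployment routes through the Okapi gateway and also needs `X-Okapi-Tenant`/`X-Okapi-Token` headers, omitted here):

```java
import java.net.URI;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class LinkedDataSearchExample {

  // Hypothetical gateway address; substitute your Okapi URL.
  static final String BASE = "http://localhost:9130";

  // Builds the GET URI for /search/linked-data/works with an
  // URL-encoded CQL query and paging parameters.
  static URI worksSearchUri(String cql, int limit, int offset) {
    String q = URLEncoder.encode(cql, StandardCharsets.UTF_8);
    return URI.create(BASE + "/search/linked-data/works?query=" + q
        + "&limit=" + limit + "&offset=" + offset);
  }

  public static void main(String[] args) {
    // Example CQL query against the linked data work index.
    System.out.println(worksSearchUri("title all \"semantic web\"", 10, 0));
  }
}
```

The same shape applies to `/search/linked-data/authorities`; only the path differs.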

#### Searching and filtering

19 changes: 12 additions & 7 deletions descriptors/ModuleDescriptor-template.json
@@ -106,18 +106,18 @@
"methods": [
"GET"
],
- "pathPattern": "/search/bibframe",
+ "pathPattern": "/search/linked-data/works",
"permissionsRequired": [
- "search.bibframe.collection.get"
+ "search.linked-data.work.collection.get"
]
},
{
"methods": [
"GET"
],
- "pathPattern": "/search/bibframe/authorities",
+ "pathPattern": "/search/linked-data/authorities",
"permissionsRequired": [
- "search.bibframe.authority.collection.get"
+ "search.linked-data.authority.collection.get"
]
},
{
@@ -616,9 +616,14 @@
"description": "Searches authorities by given query"
},
{
- "permissionName": "search.bibframe.collection.get",
- "displayName": "Search - searches bibframe by given query",
- "description": "Searches bibframe by given query"
+ "permissionName": "search.linked-data.work.collection.get",
+ "displayName": "Search - searches linked data works by given query",
+ "description": "Searches linked data works by given query"
+ },
+ {
+ "permissionName": "search.linked-data.authority.collection.get",
+ "displayName": "Search - searches linked data authorities by given query",
+ "description": "Searches linked data authorities by given query"
},
{
"permissionName": "browse.call-numbers.instances.collection.get",
30 changes: 16 additions & 14 deletions src/main/java/org/folio/search/controller/SearchController.java
@@ -3,12 +3,12 @@
import lombok.RequiredArgsConstructor;
import org.folio.search.domain.dto.Authority;
import org.folio.search.domain.dto.AuthoritySearchResult;
- import org.folio.search.domain.dto.Bibframe;
- import org.folio.search.domain.dto.BibframeAuthority;
- import org.folio.search.domain.dto.BibframeSearchAuthorityResult;
- import org.folio.search.domain.dto.BibframeSearchResult;
import org.folio.search.domain.dto.Instance;
import org.folio.search.domain.dto.InstanceSearchResult;
+ import org.folio.search.domain.dto.LinkedDataAuthority;
+ import org.folio.search.domain.dto.LinkedDataAuthoritySearchResult;
+ import org.folio.search.domain.dto.LinkedDataWork;
+ import org.folio.search.domain.dto.LinkedDataWorkSearchResult;
import org.folio.search.model.service.CqlSearchRequest;
import org.folio.search.rest.resource.SearchApi;
import org.folio.search.service.SearchService;
@@ -53,12 +53,14 @@ public ResponseEntity<InstanceSearchResult> searchInstances(String tenantId, Str
}

@Override
- public ResponseEntity<BibframeSearchResult> searchBibframe(String tenant, String query, Integer limit,
- Integer offset) {
+ public ResponseEntity<LinkedDataWorkSearchResult> searchLinkedDataWorks(String tenantId,
+ String query,
+ Integer limit,
+ Integer offset) {
var searchRequest = CqlSearchRequest.of(
- Bibframe.class, tenant, query, limit, offset, true);
+ LinkedDataWork.class, tenantId, query, limit, offset, true);
var result = searchService.search(searchRequest);
- return ResponseEntity.ok(new BibframeSearchResult()
+ return ResponseEntity.ok(new LinkedDataWorkSearchResult()
.searchQuery(query)
.content(result.getRecords())
.pageNumber(divPlusOneIfRemainder(offset, limit))
@@ -68,14 +70,14 @@ public ResponseEntity<BibframeSearchResult> searchBibframe(String tenant, String
}

@Override
- public ResponseEntity<BibframeSearchAuthorityResult> searchBibframeAuthorities(String tenant,
- String query,
- Integer limit,
- Integer offset) {
+ public ResponseEntity<LinkedDataAuthoritySearchResult> searchLinkedDataAuthorities(String tenantId,
+ String query,
+ Integer limit,
+ Integer offset) {
var searchRequest = CqlSearchRequest.of(
- BibframeAuthority.class, tenant, query, limit, offset, true);
+ LinkedDataAuthority.class, tenantId, query, limit, offset, true);
var result = searchService.search(searchRequest);
- return ResponseEntity.ok(new BibframeSearchAuthorityResult()
+ return ResponseEntity.ok(new LinkedDataAuthoritySearchResult()
.searchQuery(query)
.content(result.getRecords())
.pageNumber(divPlusOneIfRemainder(offset, limit))
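Both controller methods above compute `pageNumber` via `divPlusOneIfRemainder(offset, limit)`, whose body lies outside this diff. A standalone sketch of what the name suggests (an assumed implementation, not the actual helper): integer division of offset by limit, rounded up when there is a remainder.

```java
public class PageNumberSketch {

  // Assumed semantics of divPlusOneIfRemainder (the real helper is not
  // shown in this diff): offset / limit, plus one when offset is not an
  // exact multiple of limit, i.e. a ceiling-style page index.
  static int divPlusOneIfRemainder(Integer offset, Integer limit) {
    int quotient = offset / limit;
    return offset % limit == 0 ? quotient : quotient + 1;
  }

  public static void main(String[] args) {
    System.out.println(divPlusOneIfRemainder(0, 10));  // exact multiple
    System.out.println(divPlusOneIfRemainder(25, 10)); // remainder rounds up
  }
}
```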
@@ -196,16 +196,16 @@ public void handleLocationEvents(List<ConsumerRecord<String, ResourceEvent>> con
}

@KafkaListener(
- id = KafkaConstants.BIBFRAME_LISTENER_ID,
+ id = KafkaConstants.LINKED_DATA_LISTENER_ID,
containerFactory = "standardListenerContainerFactory",
- groupId = "#{folioKafkaProperties.listener['bibframe'].groupId}",
- concurrency = "#{folioKafkaProperties.listener['bibframe'].concurrency}",
- topicPattern = "#{folioKafkaProperties.listener['bibframe'].topicPattern}")
- public void handleBibframeEvents(List<ConsumerRecord<String, ResourceEvent>> consumerRecords) {
- log.info("Processing bibframe events from Kafka [number of events: {}]", consumerRecords.size());
+ groupId = "#{folioKafkaProperties.listener['linked-data'].groupId}",
+ concurrency = "#{folioKafkaProperties.listener['linked-data'].concurrency}",
+ topicPattern = "#{folioKafkaProperties.listener['linked-data'].topicPattern}")
+ public void handleLinkedDataEvents(List<ConsumerRecord<String, ResourceEvent>> consumerRecords) {
+ log.info("Processing linked data events from Kafka [number of events: {}]", consumerRecords.size());
var batch = consumerRecords.stream()
.map(ConsumerRecord::value)
- .map(bibframe -> bibframe.id(getResourceEventId(bibframe)))
+ .map(ld -> ld.id(getResourceEventId(ld)))
.toList();

indexResources(batch, resourceService::indexResources);
@@ -26,8 +26,8 @@ public class ResourceEventBatchInterceptor implements BatchInterceptor<String, R
Map.entry("inventory.campus", SearchUtils.CAMPUS_RESOURCE),
Map.entry("inventory.institution", SearchUtils.INSTITUTION_RESOURCE),
Map.entry("inventory.library", SearchUtils.LIBRARY_RESOURCE),
- Map.entry("search.bibframe", SearchUtils.BIBFRAME_RESOURCE),
- Map.entry("search.bibframe-authorities", SearchUtils.BIBFRAME_AUTHORITY_RESOURCE)
+ Map.entry("linked-data.work", SearchUtils.LINKED_DATA_WORK_RESOURCE),
+ Map.entry("linked-data.authority", SearchUtils.LINKED_DATA_AUTHORITY_RESOURCE)
);

@Override

This file was deleted.

This file was deleted.

This file was deleted.

@@ -1,34 +1,34 @@
- package org.folio.search.service.setter.bibframe.authority;
+ package org.folio.search.service.setter.linkeddata.authority;

import static java.util.stream.Collectors.toCollection;
- import static org.folio.search.domain.dto.BibframeAuthorityIdentifiersInner.TypeEnum.LCCN;
+ import static org.folio.search.domain.dto.LinkedDataAuthorityIdentifiersInner.TypeEnum.LCCN;

import java.util.Collections;
import java.util.LinkedHashSet;
import java.util.Objects;
import java.util.Optional;
import java.util.Set;
import lombok.RequiredArgsConstructor;
- import org.folio.search.domain.dto.BibframeAuthority;
- import org.folio.search.domain.dto.BibframeAuthorityIdentifiersInner;
+ import org.folio.search.domain.dto.LinkedDataAuthority;
+ import org.folio.search.domain.dto.LinkedDataAuthorityIdentifiersInner;
import org.folio.search.service.lccn.LccnNormalizer;
import org.folio.search.service.setter.FieldProcessor;
import org.springframework.stereotype.Component;

@Component
@RequiredArgsConstructor
- public class BibframeAuthorityLccnProcessor implements FieldProcessor<BibframeAuthority, Set<String>> {
+ public class LinkedDataAuthorityLccnProcessor implements FieldProcessor<LinkedDataAuthority, Set<String>> {

private final LccnNormalizer lccnNormalizer;

@Override
- public Set<String> getFieldValue(BibframeAuthority bibframe) {
- return Optional.of(bibframe)
- .map(BibframeAuthority::getIdentifiers)
+ public Set<String> getFieldValue(LinkedDataAuthority linkedDataAuthority) {
+ return Optional.of(linkedDataAuthority)
+ .map(LinkedDataAuthority::getIdentifiers)
.orElseGet(Collections::emptyList)
.stream()
.filter(i -> LCCN.equals(i.getType()))
- .map(BibframeAuthorityIdentifiersInner::getValue)
+ .map(LinkedDataAuthorityIdentifiersInner::getValue)
.filter(Objects::nonNull)
.map(lccnNormalizer)
.flatMap(Optional::stream)
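The processor above (shown truncated by the diff) filters a record's identifiers down to LCCN entries, normalizes each value, and collects the results into a set. A self-contained sketch of that stream pattern, using hypothetical stand-in types (the real `LinkedDataAuthority` DTOs are generated, and the real `LccnNormalizer` applies Library of Congress normalization rules; the trim/lower-case normalizer here is only a placeholder):

```java
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Objects;
import java.util.Optional;
import java.util.Set;
import java.util.stream.Collectors;

public class LccnExtractSketch {

  // Hypothetical stand-in for the generated identifier DTO.
  record Identifier(String type, String value) {}

  // Placeholder for LccnNormalizer: trims and lower-cases. The real
  // normalizer is more involved; this only illustrates the Optional shape.
  static Optional<String> normalize(String lccn) {
    return Optional.of(lccn.trim().toLowerCase());
  }

  // Mirrors the processor: keep LCCN-typed identifiers, drop null values,
  // normalize, and collect into an insertion-ordered set.
  static Set<String> lccnValues(List<Identifier> identifiers) {
    return identifiers.stream()
        .filter(i -> "LCCN".equals(i.type()))
        .map(Identifier::value)
        .filter(Objects::nonNull)
        .map(LccnExtractSketch::normalize)
        .flatMap(Optional::stream)
        .collect(Collectors.toCollection(LinkedHashSet::new));
  }
}
```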
