release-2.1.0 (#3)
* Update project to latest tag parer-kettle-2.1.0

* Update pom.xml

* Update SNAPSHOT version

* [maven-release-plugin] prepare release parer-kettle-2.1.0

* [maven-release-plugin] prepare for next development iteration

---------

Co-authored-by: parerworker <[email protected]>
Co-authored-by: Stefano Sinatti <[email protected]>
Co-authored-by: GitHub Actions <[email protected]>
4 people authored Nov 19, 2024
1 parent 6601ecc commit a6411db
Showing 16 changed files with 154 additions and 89 deletions.
12 changes: 12 additions & 0 deletions CHANGELOG.md
@@ -1,4 +1,16 @@

## 2.1.0 (11-11-2024)

### Bugfix: 2
- [#34198](https://parermine.regione.emilia-romagna.it/issues/34198) Result list truncated by the call to the transformation history
- [#34064](https://parermine.regione.emilia-romagna.it/issues/34064) Handling of the + character in the object name

### New features: 1
- [#34451](https://parermine.regione.emilia-romagna.it/issues/34451) Remove the XF_KETTLE_DB_PASSWORD parameter from the transformation report

### SUE: 1
- [#34063](https://parermine.regione.emilia-romagna.it/issues/34063) Change to the AWS URL in the kettle server configuration (all environments)

## 2.0.0 (24-04-2024)

### Bugfix: 1
6 changes: 3 additions & 3 deletions README.md
@@ -31,13 +31,13 @@ I servizi REST esposti sono i seguenti:

I servizi SOAP esposti sono i seguenti:

- **esistenzaCartella** : controlla l'esistenza di una determninata cartella nel repository di kettle.
- **esistenzaCartella** : controlla l'esistenza di una determinata cartella nel repository di kettle.
- **inserisciTransformation** : aggiunge una nuova trasformazione nel repository kettle.
- **statusCodaTrasformazione** : restituisce un'immagine delle trasformazioni in corso, in coda o eseguite.
- **inserisciJob** : aggiungi un nuovo job nel repository kettle.
- **inserisciJob** : aggiunge un nuovo job nel repository kettle.
- **eseguiTrasformazione** : esegue una trasformazione presente nel repository.
- **inserisciCartella** : crea una nuova cartella nel repository di kettle.
- **ottieniParametri** : ottiene una lista dei parametri di una determninata trasformazione.
- **ottieniParametri** : ottiene una lista dei parametri di una determinata trasformazione.
- **eliminaCartella** : elimina una cartella nel repository di kettle.

[Qui](src/docs/kettleserver.wsdl) il wsdl dell'endpoint SOAP.
16 changes: 9 additions & 7 deletions RELEASE-NOTES.md
@@ -1,9 +1,11 @@
## 2.0.0 (24-04-2024)
## 2.1.0 (11-11-2024)

### Bugfix: 1
- [#27975](https://parermine.regione.emilia-romagna.it/issues/27975) Fix to the handling of kettle startup when no object storage is present
### Bugfix: 2
- [#34198](https://parermine.regione.emilia-romagna.it/issues/34198) Result list truncated by the call to the transformation history
- [#34064](https://parermine.regione.emilia-romagna.it/issues/34064) Handling of the + character in the object name

### New features: 3
- [#30870](https://parermine.regione.emilia-romagna.it/issues/30870) Migration of the Kettle plugin to version 9.4
- [#29136](https://parermine.regione.emilia-romagna.it/issues/29136) Migration of Kettle to version 9.4
- [#25567](https://parermine.regione.emilia-romagna.it/issues/25567) Kettle configurations managed outside the deploy package
### New features: 1
- [#34451](https://parermine.regione.emilia-romagna.it/issues/34451) Remove the XF_KETTLE_DB_PASSWORD parameter from the transformation report

### SUE: 1
- [#34063](https://parermine.regione.emilia-romagna.it/issues/34063) Change to the AWS URL in the kettle server configuration (all environments)
2 changes: 1 addition & 1 deletion parer-kettle-jpa/pom.xml
@@ -4,7 +4,7 @@
<parent>
<groupId>it.eng.parer</groupId>
<artifactId>parer-kettle</artifactId>
<version>2.0.1-SNAPSHOT</version>
<version>2.1.1-SNAPSHOT</version>
</parent>
<artifactId>parer-kettle-jpa</artifactId>
<name>Parer Kettle Persistence</name>
2 changes: 1 addition & 1 deletion parer-kettle-model/pom.xml
@@ -4,7 +4,7 @@
<parent>
<groupId>it.eng.parer</groupId>
<artifactId>parer-kettle</artifactId>
<version>2.0.1-SNAPSHOT</version>
<version>2.1.1-SNAPSHOT</version>
</parent>
<artifactId>parer-kettle-model</artifactId>
<name>Parer Kettle Model</name>
2 changes: 1 addition & 1 deletion parer-kettle-rest-client/pom.xml
@@ -4,7 +4,7 @@
<parent>
<groupId>it.eng.parer</groupId>
<artifactId>parer-kettle</artifactId>
<version>2.0.1-SNAPSHOT</version>
<version>2.1.1-SNAPSHOT</version>
</parent>
<artifactId>parer-kettle-rest-client</artifactId>
<name>Parer Kettle REST Client</name>
2 changes: 1 addition & 1 deletion parer-kettle-rest/pom.xml
@@ -4,7 +4,7 @@
<parent>
<groupId>it.eng.parer</groupId>
<artifactId>parer-kettle</artifactId>
<version>2.0.1-SNAPSHOT</version>
<version>2.1.1-SNAPSHOT</version>
</parent>
<artifactId>parer-kettle-rest</artifactId>
<name>Parer Kettle REST Service</name>
13 changes: 11 additions & 2 deletions parer-kettle-server/pom.xml
@@ -3,7 +3,7 @@
<parent>
<groupId>it.eng.parer</groupId>
<artifactId>parer-kettle</artifactId>
<version>2.0.1-SNAPSHOT</version>
<version>2.1.1-SNAPSHOT</version>
</parent>

<artifactId>kettle-server</artifactId>
@@ -137,10 +137,19 @@
</dependency>

<!-- Amazon S3 -->
<dependency>
<!-- <dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-s3</artifactId>
</dependency>-->
<!-- Amazon S3 -->
<dependency>
<groupId>software.amazon.awssdk</groupId>
<artifactId>s3</artifactId>
</dependency>
<dependency>
<groupId>software.amazon.awssdk</groupId>
<artifactId>apache-client</artifactId>
</dependency>
<dependency>
<groupId>joda-time</groupId>
<artifactId>joda-time</artifactId>
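The new `software.amazon.awssdk` artifacts above carry no `<version>` element, so the parent pom presumably pins them. With AWS SDK for Java v2 this is commonly done by importing the SDK's BOM in `dependencyManagement`; a sketch follows (the version number is illustrative, not taken from this project):

```xml
<!-- Hypothetical dependencyManagement entry for the parent pom: importing the
     AWS SDK v2 BOM lets child modules declare s3 and apache-client without
     an explicit <version>. 2.25.60 is an example version only. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>software.amazon.awssdk</groupId>
      <artifactId>bom</artifactId>
      <version>2.25.60</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>
```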
@@ -64,7 +64,8 @@ List<MonExecTrasf> findByNmKsInstanceAndTiStatoTrasf(String nmKsInstance,
public long countByNmKsInstanceAndTiStatoTrasfIn(String nmKsInstance,
MonExecTrasf.STATO_TRASFORMAZIONE... tiStatoTrasf);

public Slice<MonExecTrasf> findByNmKsInstanceAndDtInizioTrasfBetweenAndTiStatoTrasfIn(Pageable pageable,
String nmKsInstance, Date startDate, Date endDate, MonExecTrasf.STATO_TRASFORMAZIONE... tiStatoTrasf);
public Slice<MonExecTrasf> findByNmKsInstanceAndDtInizioTrasfBetweenAndTiStatoTrasfInOrderByDtInizioTrasfDesc(
Pageable pageable, String nmKsInstance, Date startDate, Date endDate,
MonExecTrasf.STATO_TRASFORMAZIONE... tiStatoTrasf);

}
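The renamed repository method appends `OrderByDtInizioTrasfDesc`, which Spring Data JPA parses out of the method name into an `ORDER BY ... DESC` clause, so the `Slice` now returns the newest transformations first. A plain-Java sketch of those semantics, runnable without Spring (the `Exec` record is a hypothetical stand-in for `MonExecTrasf`):

```java
import java.time.LocalDate;
import java.util.Comparator;
import java.util.List;

public class DescOrderDemo {
    // Hypothetical stand-in for MonExecTrasf: only the start date matters here.
    record Exec(String name, LocalDate start) {}

    // Mirrors findBy...DtInizioTrasfBetween...OrderByDtInizioTrasfDesc with
    // PageRequest.of(0, limit): filter the date window, sort by start date
    // descending, keep the first `limit` rows.
    static List<Exec> latest(List<Exec> all, LocalDate from, LocalDate to, int limit) {
        return all.stream()
                .filter(e -> !e.start().isBefore(from) && !e.start().isAfter(to))
                .sorted(Comparator.comparing(Exec::start).reversed())
                .limit(limit)
                .toList();
    }
}
```

Encoding the ordering in the method name keeps the query declarative; an alternative would be passing a `Sort` inside the `Pageable`, which the existing call sites would then have to build themselves.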
@@ -27,6 +27,7 @@
import it.eng.parer.kettle.server.persistence.lite.dao.ReportRepository;
import it.eng.parer.kettle.service.DataService;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Date;
import java.util.List;
import org.apache.commons.lang.StringEscapeUtils;
@@ -61,6 +62,9 @@ public class DataServiceImpl implements DataService {
@Autowired
private ReportRepository reportRepository;

// MEV34451 TODO: read the list from a file.
private final String[] blackListedParams = { "XF_KETTLE_DB_PASSWORD" };

@Override
@Transactional(propagation = Propagation.REQUIRES_NEW)
public boolean accettaTrasformazione(Trasformazione trasformazione) {
@@ -140,6 +144,12 @@ public void iniziaTrasformazione(Trasformazione trasformazione) {

StringBuilder parameters = new StringBuilder();
for (Parametro parametro : trasformazione.getParametri()) {

if (Arrays.stream(blackListedParams).anyMatch(parametro.getNomeParametro()::equals)) {
// MEV34451 skip blacklisted parameters.
continue;
}

parameters = parameters.length() > 0
? parameters.append(" | ").append(parametro.getNomeParametro()).append(" : ")
.append(parametro.getValoreParametro())
@@ -396,9 +406,9 @@ public List<StatoTrasformazione> getStoricoTrasformazioni(Date startDate, Date e
List<StatoTrasformazione> storicoTrasformazioni = new ArrayList<>();

List<MonExecTrasf> monitoraggi = monitoraggioRepository
.findByNmKsInstanceAndDtInizioTrasfBetweenAndTiStatoTrasfIn(PageRequest.of(0, numResults),
ottieniParametroConfigurazione("config.instance_name"), startDate, endDate,
MonExecTrasf.STATO_TRASFORMAZIONE.ERRORE_TRASFORMAZIONE,
.findByNmKsInstanceAndDtInizioTrasfBetweenAndTiStatoTrasfInOrderByDtInizioTrasfDesc(
PageRequest.of(0, numResults), ottieniParametroConfigurazione("config.instance_name"),
startDate, endDate, MonExecTrasf.STATO_TRASFORMAZIONE.ERRORE_TRASFORMAZIONE,
MonExecTrasf.STATO_TRASFORMAZIONE.TRASFORMAZIONE_TERMINATA)
.getContent();

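The MEV34451 change skips blacklisted parameter names while assembling the report string, so credentials such as `XF_KETTLE_DB_PASSWORD` never appear in it. A self-contained sketch of the same logic (the `Param` record is hypothetical; the real code iterates `Parametro` objects and scans a `String[]` with `Arrays.stream(...).anyMatch`):

```java
import java.util.List;
import java.util.Set;

public class ParamReportDemo {
    // Parameter names excluded from the transformation report (MEV34451).
    private static final Set<String> BLACKLIST = Set.of("XF_KETTLE_DB_PASSWORD");

    // Hypothetical stand-in for Parametro.
    record Param(String name, String value) {}

    // Builds the " | "-separated report string, skipping blacklisted names,
    // mirroring the loop in DataServiceImpl.iniziaTrasformazione.
    static String report(List<Param> params) {
        StringBuilder sb = new StringBuilder();
        for (Param p : params) {
            if (BLACKLIST.contains(p.name())) {
                continue; // never write credentials into the report
            }
            if (sb.length() > 0) {
                sb.append(" | ");
            }
            sb.append(p.name()).append(" : ").append(p.value());
        }
        return sb.toString();
    }
}
```

A `Set` gives constant-time lookup and reads more directly than the linear array scan; per the TODO in the diff, the list could later be loaded from a file instead of being hard-coded.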
@@ -17,7 +17,6 @@

package it.eng.parer.kettle.server.persistence.service;

import com.amazonaws.services.s3.model.S3Object;
import it.eng.parer.kettle.model.KettleCrudException;
import it.eng.parer.kettle.model.KettleJob;
import it.eng.parer.kettle.model.KettleTransformation;
@@ -34,8 +33,8 @@
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URLEncoder;
import java.util.ArrayList;
import java.util.GregorianCalendar;
import java.util.List;
@@ -63,6 +62,8 @@
import org.springframework.scheduling.annotation.Async;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import software.amazon.awssdk.core.ResponseInputStream;
import software.amazon.awssdk.services.s3.model.GetObjectResponse;

/**
*
@@ -481,7 +482,6 @@ private String preparaTrasformazioneDaObjectStorage(Trasformazione trasformazion
+ trasformazione.getIdOggettoPing());
}

S3Object s3Object = s3Client.getObject(oSBucket, oSKey);
File targetFileDirectory = new File(targetFileDestDirectory + File.separator + "INPUT_FILE",
FilenameUtils.getBaseName(targetFileDest));
File targetFile = new File(targetFileDirectory, FilenameUtils.getName(targetFileDest));
@@ -490,11 +490,12 @@ private String preparaTrasformazioneDaObjectStorage(Trasformazione trasformazion
targetFileDirectory.mkdirs();
}

try (InputStream in = s3Object.getObjectContent(); OutputStream os = new FileOutputStream(targetFile)) {

IOUtils.copyLarge(in, os);
try (OutputStream os = new FileOutputStream(targetFile)) {
ResponseInputStream<GetObjectResponse> s3Object = s3Client.getObject(oSBucket, oSKey);
IOUtils.copyLarge(s3Object, os);

} catch (Exception ex) {
LOGGER.error("Errore nel recupero del file da object storage per " + trasformazione.getIdOggettoPing(), ex);
throw new KettleException(
"Errore nel recupero del file da object storage per " + trasformazione.getIdOggettoPing());
}
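One detail worth noting in the rewritten download: the `ResponseInputStream` returned by SDK v2's `getObject` wraps the underlying HTTP connection, but in the new code it is opened outside the try-with-resources, so it is never closed explicitly. A minimal sketch of the leak-free pattern, written with plain `java.io` streams so it runs anywhere (in the real method `in` would be the `ResponseInputStream<GetObjectResponse>`):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;

public class CopyDemo {
    // Copies an object stream to a destination, closing BOTH streams.
    // With AWS SDK v2, closing the response stream releases the pooled
    // HTTP connection back to the Apache client.
    static long copy(InputStream in, OutputStream out) throws IOException {
        try (in; out) {
            return in.transferTo(out);
        }
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        long n = copy(new ByteArrayInputStream("payload".getBytes(StandardCharsets.UTF_8)), sink);
        System.out.println(n + " bytes copied");
    }
}
```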
@@ -17,27 +17,32 @@

package it.eng.parer.kettle.ws.client;

import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.client.builder.AwsClientBuilder;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3Client;
import com.amazonaws.services.s3.model.CompleteMultipartUploadRequest;
import com.amazonaws.services.s3.model.CompleteMultipartUploadResult;
import com.amazonaws.services.s3.model.InitiateMultipartUploadRequest;
import com.amazonaws.services.s3.model.InitiateMultipartUploadResult;
import com.amazonaws.services.s3.model.S3Object;
import com.amazonaws.services.s3.model.UploadPartRequest;
import com.amazonaws.services.s3.model.UploadPartResult;
import it.eng.parer.kettle.service.DataService;
import java.io.File;
import java.net.URI;
import java.time.Duration;
import javax.annotation.PostConstruct;
import javax.annotation.PreDestroy;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;
import software.amazon.awssdk.auth.credentials.AwsBasicCredentials;
import software.amazon.awssdk.auth.credentials.AwsCredentialsProvider;
import software.amazon.awssdk.auth.credentials.StaticCredentialsProvider;
import software.amazon.awssdk.awscore.exception.AwsServiceException;
import software.amazon.awssdk.core.ResponseInputStream;
import software.amazon.awssdk.core.exception.SdkClientException;
import software.amazon.awssdk.core.sync.RequestBody;
import software.amazon.awssdk.http.apache.ApacheHttpClient;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.DeleteObjectRequest;
import software.amazon.awssdk.services.s3.model.GetObjectRequest;
import software.amazon.awssdk.services.s3.model.GetObjectResponse;
import software.amazon.awssdk.services.s3.model.HeadObjectRequest;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;
import software.amazon.awssdk.services.s3.model.S3Exception;

/**
*
@@ -51,7 +56,7 @@ public class S3ClientBean {
@Autowired
private DataService dataService;

private AmazonS3 awsClient;
private S3Client awsClient;

private boolean isActiveFlag = false;

@@ -72,12 +77,16 @@ private void init() {

// Instantiate the HTTP client (provides the Amazon S3 protocol calls)
LOGGER.info("Sto per effettuare il collegamento all'endpoint S3 [ " + storageAddress + "]");
BasicAWSCredentials awsCreds = new BasicAWSCredentials(accessKeyId, secretKey);
awsClient = AmazonS3Client.builder()
.withEndpointConfiguration(
new AwsClientBuilder.EndpointConfiguration(storageAddress, Regions.US_EAST_1.name()))
.withCredentials(new AWSStaticCredentialsProvider(awsCreds))
.withPathStyleAccessEnabled(Boolean.TRUE).build();

final AwsCredentialsProvider credProvider = StaticCredentialsProvider
.create(AwsBasicCredentials.create(accessKeyId, secretKey));

awsClient = S3Client.builder().endpointOverride(URI.create(storageAddress)).region(Region.US_EAST_1)
.credentialsProvider(credProvider).forcePathStyle(true)
.httpClientBuilder(ApacheHttpClient.builder().maxConnections(100)
.connectionTimeout(Duration.ofMinutes(1L)).socketTimeout(Duration.ofMinutes(10L)))
.build();

LOGGER.info("########## CLIENT S3 INIZIALIZZATO ###################");
} else {
LOGGER.info("########## CLIENT S3 DISATTIVO ###################");
@@ -91,42 +100,46 @@ private void destroy() {
private void destroy() {
LOGGER.info("Shutdown endpoint S3...");
if (awsClient != null) {
awsClient.shutdown();
awsClient.close();
}
}

public void deleteObject(String bucketName, String key) {
awsClient.deleteObject(bucketName, key);
DeleteObjectRequest delOb = DeleteObjectRequest.builder().bucket(bucketName).key(key).build();
awsClient.deleteObject(delOb);
}

public S3Object getObject(String bucketName, String key) {
return awsClient.getObject(bucketName, key);
}
public ResponseInputStream<GetObjectResponse> getObject(String bucketName, String key) throws Exception {
try {
GetObjectRequest getObjectRequest = GetObjectRequest.builder().bucket(bucketName).key(key).build();
return awsClient.getObject(getObjectRequest);

public boolean doesObjectExist(String bucketName, String key) {
return awsClient.doesObjectExist(bucketName, key);
} catch (AwsServiceException | SdkClientException e) {
LOGGER.error("impossibile ottenere dal bucket " + bucketName + " oggetto con chiave " + key, e);
throw new Exception("impossibile ottenere dal bucket " + bucketName + " oggetto con chiave " + key, e);
}
}

public void putObject(String bucketName, String nomeFilePacchetto, File file) {
awsClient.putObject(bucketName, nomeFilePacchetto, file);
}
public boolean doesObjectExist(String bucketName, String key) {
HeadObjectRequest objectRequest = HeadObjectRequest.builder().key(key).bucket(bucketName).build();

public void putObject(String bucketName, String nomeFilePacchetto, String content) {
awsClient.putObject(bucketName, nomeFilePacchetto, content);
}
try {
awsClient.headObject(objectRequest);
return true;

public InitiateMultipartUploadResult initiateMultipartUpload(
InitiateMultipartUploadRequest initiateMultipartUploadRequest) {
return awsClient.initiateMultipartUpload(initiateMultipartUploadRequest);
} catch (S3Exception e) {
return false;
}
}

public CompleteMultipartUploadResult completeMultipartUpload(
CompleteMultipartUploadRequest completeMultipartUploadRequest) {
return awsClient.completeMultipartUpload(completeMultipartUploadRequest);
public void putObject(String bucketName, String nomeFilePacchetto, File file) {
PutObjectRequest putOb = PutObjectRequest.builder().bucket(bucketName).key(nomeFilePacchetto).build();
awsClient.putObject(putOb, RequestBody.fromFile(file));
}

public UploadPartResult uploadPart(UploadPartRequest uploadPartRequest) {
return awsClient.uploadPart(uploadPartRequest);
public void putObject(String bucketName, String nomeFilePacchetto, String content) {
PutObjectRequest putOb = PutObjectRequest.builder().bucket(bucketName).key(nomeFilePacchetto).build();
awsClient.putObject(putOb, RequestBody.fromString(content));
}

public boolean isActive() {
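The rewritten `doesObjectExist` maps any `S3Exception` to `false`, which also swallows authorization and configuration failures as "object absent". A hedged sketch (not from this repo; it needs `software.amazon.awssdk:s3` on the classpath, so it is compile-time illustration only) that treats only 404 as absence and lets other errors surface. `headObject` has been known to raise a plain `S3Exception` with status 404 rather than `NoSuchKeyException`, since a HEAD response carries no error body, so both cases are checked:

```java
// Sketch only: requires the AWS SDK v2 s3 artifact on the classpath.
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.HeadObjectRequest;
import software.amazon.awssdk.services.s3.model.NoSuchKeyException;
import software.amazon.awssdk.services.s3.model.S3Exception;

public class HeadObjectSketch {
    // Distinguishes "object absent" (404 / NoSuchKeyException) from other
    // failures (credentials, bucket policy, connectivity), which a broad
    // catch-and-return-false would hide.
    static boolean objectExists(S3Client s3, String bucket, String key) {
        try {
            s3.headObject(HeadObjectRequest.builder().bucket(bucket).key(key).build());
            return true;
        } catch (NoSuchKeyException e) {
            return false;
        } catch (S3Exception e) {
            if (e.statusCode() == 404) {
                return false;
            }
            throw e; // auth, permission, and transport problems should surface
        }
    }
}
```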
2 changes: 1 addition & 1 deletion parer-kettle-service/pom.xml
@@ -4,7 +4,7 @@
<parent>
<groupId>it.eng.parer</groupId>
<artifactId>parer-kettle</artifactId>
<version>2.0.1-SNAPSHOT</version>
<version>2.1.1-SNAPSHOT</version>
</parent>
<artifactId>parer-kettle-service</artifactId>
<packaging>jar</packaging>