Release 4.6.1 #38

Merged: 20 commits, Oct 31, 2024

Commits:
2ed4036
Bump junit from 4.10 to 4.13.1
dependabot[bot] Oct 13, 2020
d0f3f82
Bump gson from 2.3 to 2.8.9 in /ncsa-common-clowder
dependabot[bot] May 20, 2022
6420855
Bump gson from 2.3 to 2.8.9 in /ncsa-common-incore
dependabot[bot] May 20, 2022
1b4ef47
Bump jsch from 0.1.48 to 0.1.54 in /datawolf-executor-hpc
dependabot[bot] Jul 6, 2022
52a7a0c
Merge pull request #22 from ncsa/main
navarroc Feb 18, 2023
ba0645a
Set version to 4.7.0-SNAPSHOT and update CHANGELOG
navarroc Feb 21, 2023
f2eeb74
Fixes 23 - added exception to log print statement
navarroc Apr 21, 2023
9e89d02
Merge pull request #2 from ncsa/dependabot/maven/junit-junit-4.13.1
navarroc Apr 21, 2023
ecc2f14
Merge pull request #26 from ncsa/23-kubernetes-executor-does-not-prin…
navarroc Jul 7, 2023
4d84a5d
Merge pull request #6 from ncsa/dependabot/maven/ncsa-common-incore/c…
navarroc Sep 6, 2024
0521797
Merge pull request #5 from ncsa/dependabot/maven/ncsa-common-clowder/…
navarroc Sep 6, 2024
f944669
Merge pull request #7 from ncsa/dependabot/maven/datawolf-executor-hp…
navarroc Sep 6, 2024
2a7afce
Fixes #28 - added the user/creator information to environment of proc…
navarroc Sep 13, 2024
c841266
Fixes #29 changed IN-CORE Dataset DAO and FileStorage implementation …
navarroc Sep 24, 2024
2f1536b
Upgrade hsql to 2.7.1 (#32)
ylyangtw Sep 24, 2024
0dea50b
Fixes #33 - updated custom properties to include more configuration v…
navarroc Sep 30, 2024
7e5c67d
Extra Variables added for INCORE specific (#36)
ywkim312 Oct 24, 2024
7b39cf4
Release 4.6.1
ywkim312 Oct 29, 2024
035d8c0
renamed to 4.7.0
ywkim312 Oct 30, 2024
d9b36af
removed the word snapshot from all pom.xml
ywkim312 Oct 30, 2024
Conversations
12 changes: 12 additions & 0 deletions CHANGELOG.md
Member:
Looking at all the changes, I think this is 4.7.0 and not just some changes to the Helm chart.
Make sure to remove the SNAPSHOT from all the poms.

Collaborator (author):
Changed to release 4.7.0 and updated the pom.xml files by removing SNAPSHOT. Why does this happen? Is it an automatic update? I also see other versions like 3.1.0-SNAPSHOT in other DataWolf components, such as datawolf-executor-commandline-ui. Should those also be updated to 4.7.0?

Collaborator:

With a Maven build, you can create as many SNAPSHOT builds as you want; however, you can only push a single release. So when we do a release, we remove SNAPSHOT, and merging to main pushes the release version (4.7.0).

You can ignore the 3.1.0-SNAPSHOT versions. Those components are no longer used, or haven't been used in a long time, and aren't part of the standard build.
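To illustrate the convention described above, this is the kind of change a release makes to the parent pom.xml (illustrative fragment; line positions are not from this PR):

```xml
<!-- During development: SNAPSHOT builds can be published repeatedly -->
<version>4.7.0-SNAPSHOT</version>

<!-- At release time: the qualifier is dropped, and merging to main
     publishes this exact version exactly once -->
<version>4.7.0</version>
```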

Collaborator (author):

I see, thanks for the explanation. If things look good, let's get it merged and released so we can test DataWolf in the IN-CORE tst cluster.

@@ -4,6 +4,18 @@
All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](http://keepachangelog.com/)
and this project adheres to [Semantic Versioning](http://semver.org/).

## [4.7.0] - 2024-10-29

### Added
- User information to environment of process running the tool [#28](https://github.com/ncsa/datawolf/issues/28)
- Ability for extra environment variables (used by IN-CORE) [#35](https://github.com/ncsa/datawolf/issues/35)

### Changed
- IN-CORE Dataset DAO and FileStorage implementation to use latest API [#29](https://github.com/ncsa/datawolf/issues/29)
- Kubernetes executor prints exception [#23](https://github.com/ncsa/datawolf/issues/23)
- Upgrade hsqldb to 2.7.3 [#27](https://github.com/ncsa/datawolf/issues/27)
- Custom properties to include more configuration variables [#33](https://github.com/ncsa/datawolf/issues/33)

## [4.6.0] - 2023-02-15

### Added
7 changes: 3 additions & 4 deletions charts/datawolf/Chart.yaml
@@ -13,7 +13,7 @@ version: 1.0.1
# This is the version number of the application being deployed. This version number should be
# incremented each time you make changes to the application. Versions are not expected to
# follow Semantic Versioning. They should reflect the version the application is using.
appVersion: 4.6.0
appVersion: 4.7.0

# List of people that maintain this helm chart.
maintainers:
@@ -38,6 +38,5 @@ dependencies:
# annotations for artifact.io
annotations:
artifacthub.io/changes: |
- add ability to set dataset permission (public/private)
- fix Chart.yaml
- fix ingress (deploy at /datawolf)
- User information to environment of process running the tool
- Ability for extra environment variables (used by IN-CORE)
3 changes: 3 additions & 0 deletions charts/datawolf/templates/deployment.yaml
@@ -94,6 +94,9 @@ spec:
value: {{ .Values.jobs.cpu | quote }}
- name: KUBERNETES_MEMORY
value: {{ .Values.jobs.memory | quote }}
{{- if .Values.extraEnvVars }}
{{ .Values.extraEnvVars | toYaml | nindent 12 }}
{{- end }}
volumeMounts:
- name: {{ include "datawolf.fullname" . }}
mountPath: /home/datawolf/data
2 changes: 2 additions & 0 deletions charts/datawolf/values.yaml
@@ -33,6 +33,8 @@ jobs:
# default memory in GB per job
memory: 4.0

extraEnvVars: {}

serviceAccount:
# Specifies whether a service account should be created
create: true
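For reference, the new `extraEnvVars` value introduced in values.yaml above could be overridden like this (illustrative variable names, not from this PR; each entry is rendered verbatim into the container's env list by the new block in deployment.yaml):

```yaml
# Hypothetical values override: inject extra environment variables
# into the DataWolf container via the new extraEnvVars hook
extraEnvVars:
  - name: INCORE_SERVICE_URL
    value: "https://example.org/incore"
  - name: EXTRA_DEBUG_FLAG
    value: "true"
```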
2 changes: 1 addition & 1 deletion datawolf-core/pom.xml
@@ -5,7 +5,7 @@
<parent>
<groupId>edu.illinois.ncsa</groupId>
<artifactId>datawolf</artifactId>
<version>4.6.0</version>
<version>4.7.0</version>
</parent>
<artifactId>datawolf-core</artifactId>

@@ -36,6 +36,8 @@
public abstract class Executor {
private static Logger logger = LoggerFactory.getLogger(Executor.class);

protected static String DATAWOLF_USER = "DATAWOLF_USER";

private StringBuilder log = new StringBuilder();
private LogFile logfile = new LogFile();
private int lastsave = 0;
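The `DATAWOLF_USER` constant added above is the name under which the executors now export the submitting user's email into the tool's environment. A hypothetical tool wrapped by DataWolf could consume it like this (sketch; class name and fallback behavior are assumptions — the variable is absent when the execution has no creator):

```java
import java.util.Map;

// Hypothetical tool-side sketch: read the DATAWOLF_USER variable
// injected by the executor; it may be absent, so default it.
public class ToolUserExample {
    // Resolve the submitting user from an environment map,
    // falling back to "unknown" when the variable is not set.
    static String resolveUser(Map<String, String> env) {
        return env.getOrDefault("DATAWOLF_USER", "unknown");
    }

    public static void main(String[] args) {
        System.out.println("Running on behalf of: " + resolveUser(System.getenv()));
    }
}
```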
10 changes: 5 additions & 5 deletions datawolf-doc/doc/manual/index.html
@@ -247,18 +247,18 @@

https://opensource.ncsa.illinois.edu/projects/artifacts.php?key=WOLF

By default, the latest release is selected in the page (currently 4.6.0). To get early access to development releases, check the box **Show also prereleases.**
By default, the latest release is selected in the page (currently 4.7.0). To get early access to development releases, check the box **Show also prereleases.**

* Click on **Version**
* Select **4.6.0**
* Under **Files** select **datawolf-webapp-all-4.6.0-bin.zip**
* Select **4.7.0**
* Under **Files** select **datawolf-webapp-all-4.7.0-bin.zip**
* Click **I Accept** to accept the License.

This will give you the latest stable build that includes both the Data Wolf Server and the Web Editor. You can also find links to the javacode there as well as the manual. The link to the source code can be found at the end of this document.

### Installation and Setup

To install the files necessary for the Server and Editor, find where you downloaded Data Wolf and unzip it somewhere. This will create a folder called **datawolf-webapp-all-4.6.0**. In the next few sections, we'll discuss some of the important files that come with the installation you just unzipped so you can tailor your setup to meet your needs. If you wish to skip this, you can go directly to the section **Running Data Wolf Server and Editor**.
To install the files necessary for the Server and Editor, find where you downloaded Data Wolf and unzip it somewhere. This will create a folder called **datawolf-webapp-all-4.7.0**. In the next few sections, we'll discuss some of the important files that come with the installation you just unzipped so you can tailor your setup to meet your needs. If you wish to skip this, you can go directly to the section **Running Data Wolf Server and Editor**.

#### Data Wolf properties

@@ -443,7 +443,7 @@

#### Launch Scripts

If you go back to the folder **Data Wolf-webapp-all-4.6.0** you will see a sub-folder called **bin**, open this. Inside you will find two scripts, **datawolf-service** and **datawolf-service.bat**. The latter is intended for running Data Wolf on a Windows machine and the former is for running on Mac & Linux. As with the previous section, knowledge of this file is not required unless you are interested in configuring the Data Wolf Server and Editor beyond the default settings. We will show snippets of the file **datawolf-service** and discuss what each section is configuring.
If you go back to the folder **Data Wolf-webapp-all-4.7.0** you will see a sub-folder called **bin**, open this. Inside you will find two scripts, **datawolf-service** and **datawolf-service.bat**. The latter is intended for running Data Wolf on a Windows machine and the former is for running on Mac & Linux. As with the previous section, knowledge of this file is not required unless you are interested in configuring the Data Wolf Server and Editor beyond the default settings. We will show snippets of the file **datawolf-service** and discuss what each section is configuring.

```
# port for the jetty server
2 changes: 1 addition & 1 deletion datawolf-doc/pom.xml
@@ -3,7 +3,7 @@
<parent>
<groupId>edu.illinois.ncsa</groupId>
<artifactId>datawolf</artifactId>
<version>4.6.0</version>
<version>4.7.0</version>
</parent>
<artifactId>datawolf-doc</artifactId>
</project>
2 changes: 1 addition & 1 deletion datawolf-domain/pom.xml
@@ -5,7 +5,7 @@
<parent>
<groupId>edu.illinois.ncsa</groupId>
<artifactId>datawolf</artifactId>
<version>4.6.0</version>
<version>4.7.0</version>
</parent>
<artifactId>datawolf-domain</artifactId>

2 changes: 1 addition & 1 deletion datawolf-editor/pom.xml
@@ -4,7 +4,7 @@
<parent>
<groupId>edu.illinois.ncsa</groupId>
<artifactId>datawolf</artifactId>
<version>4.6.0</version>
<version>4.7.0</version>
</parent>
<packaging>war</packaging>
<artifactId>datawolf-editor</artifactId>
2 changes: 1 addition & 1 deletion datawolf-executor-commandline/pom.xml
@@ -4,7 +4,7 @@
<parent>
<artifactId>datawolf</artifactId>
<groupId>edu.illinois.ncsa</groupId>
<version>4.6.0</version>
<version>4.7.0</version>
</parent>
<artifactId>datawolf-executor-commandline</artifactId>

@@ -108,6 +108,12 @@ public void execute(File cwd) throws AbortException, FailedException {
env.putAll(impl.getEnv());
}

// Add user to the environment in case a tool needs this information
if(execution.getCreator() != null) {
String user = execution.getCreator().getEmail();
env.put(DATAWOLF_USER, user);
}

// find the app to execute
command.add(findApp(impl.getExecutable().trim(), cwd));

@@ -170,9 +176,14 @@ public void execute(File cwd) throws AbortException, FailedException {
throw (new FailedException("Could not get input file.", e));
}
} else {

// Create a folder for the datasets
File inputFolder = new File(filename);
if (inputFolder.exists() && inputFolder.getAbsolutePath().startsWith(System.getProperty("java.io.tmpdir"))) {
// For single file, a tmp file got created above; however in this case, we need
// a temporary folder to store the files
inputFolder.delete();
}

if (!inputFolder.mkdirs()) {
throw (new FailedException("Could not create folder for input files"));
}
@@ -251,6 +262,7 @@ public void execute(File cwd) throws AbortException, FailedException {
sb.append(" ");
}
println("Executing : " + sb.toString());
logger.debug("Executing : " + sb.toString());

// create the process builder
ProcessBuilder pb = new ProcessBuilder(command);
@@ -369,11 +381,11 @@ public void execute(File cwd) throws AbortException, FailedException {
ds.setTitle(step.getTool().getOutput(impl.getCaptureStdOut()).getTitle());
ds.setCreator(execution.getCreator());

ds = datasetDao.save(ds);

ByteArrayInputStream bais = new ByteArrayInputStream(stdout.toString().getBytes("UTF-8"));
FileDescriptor fd = fileStorage.storeFile(step.getTool().getOutput(impl.getCaptureStdOut()).getTitle(), bais, execution.getCreator(), ds);

ds = datasetDao.save(ds);

execution.setDataset(step.getOutputs().get(impl.getCaptureStdOut()), ds.getId());
saveExecution = true;
} catch (IOException exc) {
@@ -385,11 +397,11 @@ public void execute(File cwd) throws AbortException, FailedException {
Dataset ds = new Dataset();
ds.setTitle(step.getTool().getOutput(impl.getCaptureStdErr()).getTitle());
ds.setCreator(execution.getCreator());
ds = datasetDao.save(ds);

ByteArrayInputStream bais = new ByteArrayInputStream(stderr.toString().getBytes("UTF-8"));
FileDescriptor fd = fileStorage.storeFile(step.getTool().getOutput(impl.getCaptureStdErr()).getTitle(), bais, execution.getCreator(), ds);

ds = datasetDao.save(ds);

execution.setDataset(step.getOutputs().get(impl.getCaptureStdErr()), ds.getId());
saveExecution = true;
@@ -419,15 +431,15 @@ public boolean accept(File pathname) {
for (File file : files) {
logger.debug("adding files to a dataset: " + file);
FileInputStream fis = new FileInputStream(file);
fileStorage.storeFile(file.getName(), fis, ds.getCreator(), ds);
fileStorage.storeFile(file.getName(), fis, execution.getCreator(), ds);
fis.close();
}

} else {
FileInputStream fis = new FileInputStream(entry.getValue());
fileStorage.storeFile(new File(entry.getValue()).getName(), fis, ds.getCreator(), ds);
fileStorage.storeFile(new File(entry.getValue()).getName(), fis, execution.getCreator(), ds);
}
ds = datasetDao.save(ds);
// ds = datasetDao.save(ds);

execution.setDataset(step.getOutputs().get(entry.getKey()), ds.getId());
saveExecution = true;
4 changes: 2 additions & 2 deletions datawolf-executor-hpc/pom.xml
@@ -4,7 +4,7 @@
<parent>
<artifactId>datawolf</artifactId>
<groupId>edu.illinois.ncsa</groupId>
<version>4.6.0</version>
<version>4.7.0</version>
</parent>
<artifactId>datawolf-executor-hpc</artifactId>

@@ -27,7 +27,7 @@
<dependency>
<groupId>com.jcraft</groupId>
<artifactId>jsch</artifactId>
<version>0.1.48</version>
<version>0.1.54</version>
</dependency>
</dependencies>
</project>
2 changes: 1 addition & 1 deletion datawolf-executor-java-tool/pom.xml
@@ -4,7 +4,7 @@
<parent>
<artifactId>datawolf</artifactId>
<groupId>edu.illinois.ncsa</groupId>
<version>4.6.0</version>
<version>4.7.0</version>
</parent>
<artifactId>datawolf-executor-java-tool</artifactId>

2 changes: 1 addition & 1 deletion datawolf-executor-java/pom.xml
@@ -5,7 +5,7 @@
<parent>
<artifactId>datawolf</artifactId>
<groupId>edu.illinois.ncsa</groupId>
<version>4.6.0</version>
<version>4.7.0</version>
</parent>
<artifactId>datawolf-executor-java</artifactId>

2 changes: 1 addition & 1 deletion datawolf-executor-kubernetes/pom.xml
@@ -3,7 +3,7 @@
<parent>
<groupId>edu.illinois.ncsa</groupId>
<artifactId>datawolf</artifactId>
<version>4.6.0</version>
<version>4.7.0</version>
</parent>
<artifactId>datawolf-executor-kubernetes</artifactId>
<dependencies>
@@ -159,10 +159,17 @@ public State submitRemoteJob(File cwd) throws AbortException, FailedException {

// Create a folder for the datasets
File inputFolder = new File(filename);
if (inputFolder.exists()) {
// For single file, a tmp file got created above; however in this case, we need
// a temporary folder to store the files
inputFolder.delete();
}

if (!inputFolder.mkdirs()) {
throw (new FailedException("Could not create folder for input files"));
}


int duplicate = 1;
for (FileDescriptor fd : ds.getFileDescriptors()) {
String localFileName = fd.getFilename();
@@ -285,8 +292,25 @@ public State submitRemoteJob(File cwd) throws AbortException, FailedException {
container.args(command);
// add any environment variables
if (!impl.getEnv().isEmpty()) {
// TODO implement
//container.addEnvItem();
Map<String, String> environment = impl.getEnv();

for (Map.Entry<String, String> entry : environment.entrySet()) {
String key = entry.getKey();
String value = entry.getValue();
V1EnvVar envVar = new V1EnvVar();
envVar.setName(key);
envVar.setValue(value);
container.addEnvItem(envVar);
}
}

// Add user to the environment in case a tool needs this information
if(execution.getCreator() != null) {
String user = execution.getCreator().getEmail();
V1EnvVar envVar = new V1EnvVar();
envVar.setName(DATAWOLF_USER);
envVar.setValue(user);
container.addEnvItem(envVar);
}

// add resource limits
@@ -318,7 +342,7 @@ public State submitRemoteJob(File cwd) throws AbortException, FailedException {
throw e;
} catch (FailedException e) {
// Job could not be submitted, set state to waiting to try again
logger.info("Job not submitted because the job scheduler appears to be down, will try again shortly...");
logger.info("Job not submitted because the job scheduler appears to be down, will try again shortly...", e);
return State.WAITING;
// throw e;
} catch (Throwable e) {
@@ -384,12 +408,11 @@ public State checkRemoteJob() throws FailedException {
Dataset ds = new Dataset();
ds.setTitle(step.getTool().getOutput(impl.getCaptureStdOut()).getTitle());
ds.setCreator(execution.getCreator());
ds = datasetDao.save(ds);

ByteArrayInputStream bais = new ByteArrayInputStream(lastlog.getBytes("UTF-8"));
FileDescriptor fd = fileStorage.storeFile(step.getTool().getOutput(impl.getCaptureStdOut()).getTitle(), bais, execution.getCreator(), ds);

ds = datasetDao.save(ds);

execution.setDataset(step.getOutputs().get(impl.getCaptureStdOut()), ds.getId());
saveExecution = true;
}
@@ -419,15 +442,15 @@ public boolean accept(File pathname) {
for (File file : files) {
logger.debug("adding files to a dataset: " + file);
FileInputStream fis = new FileInputStream(file);
fileStorage.storeFile(file.getName(), fis, ds.getCreator(), ds);
fileStorage.storeFile(file.getName(), fis, execution.getCreator(), ds);
fis.close();
}

} else {
FileInputStream fis = new FileInputStream(entry.getValue());
fileStorage.storeFile(new File(entry.getValue()).getName(), fis, ds.getCreator(), ds);
fileStorage.storeFile(new File(entry.getValue()).getName(), fis, execution.getCreator(), ds);
}
ds = datasetDao.save(ds);
// ds = datasetDao.save(ds);

execution.setDataset(step.getOutputs().get(entry.getKey()), ds.getId());
saveExecution = true;
2 changes: 1 addition & 1 deletion datawolf-jpa/pom.xml
@@ -4,7 +4,7 @@
<parent>
<artifactId>datawolf</artifactId>
<groupId>edu.illinois.ncsa</groupId>
<version>4.6.0</version>
<version>4.7.0</version>
</parent>
<artifactId>datawolf-jpa</artifactId>
<packaging>jar</packaging>
2 changes: 1 addition & 1 deletion datawolf-provenance/pom.xml
@@ -4,7 +4,7 @@
<parent>
<groupId>edu.illinois.ncsa</groupId>
<artifactId>datawolf</artifactId>
<version>4.6.0</version>
<version>4.7.0</version>
</parent>
<packaging>war</packaging>
<artifactId>datawolf-provenance</artifactId>
2 changes: 1 addition & 1 deletion datawolf-service-client/pom.xml
@@ -4,7 +4,7 @@
<parent>
<groupId>edu.illinois.ncsa</groupId>
<artifactId>datawolf</artifactId>
<version>4.6.0</version>
<version>4.7.0</version>
</parent>
<artifactId>datawolf-service-client</artifactId>

2 changes: 1 addition & 1 deletion datawolf-service/pom.xml
@@ -4,7 +4,7 @@
<parent>
<artifactId>datawolf</artifactId>
<groupId>edu.illinois.ncsa</groupId>
<version>4.6.0</version>
<version>4.7.0</version>
</parent>
<artifactId>datawolf-service</artifactId>
<dependencies>