
Commit 17f04b9

Add README files and disable generateSpec for all plugins

- Add README.md files for all plugins following the nextflow-plugin-gradle requirements
- Each README includes: Summary, Get Started, Examples, Resources, and License sections
- Disable generateSpec in build.gradle for all plugins (temporary workaround)

Signed-off-by: Paolo Di Tommaso <[email protected]>

1 parent: 59c4f4e

19 files changed: +755 −51 lines

plugins/gradle.properties
Lines changed: 1 addition & 1 deletion

```diff
@@ -1,3 +1,3 @@
 ## plugins settings
-nextflowPluginVersion=1.0.0-beta.10
+nextflowPluginVersion=1.0.0-beta.11
 nextflowPluginProvider=nextflow-io
```

plugins/nf-amazon/README.md
Lines changed: 87 additions & 0 deletions (new file)

# Nextflow plugin for Amazon Web Services

This plugin provides support for Amazon Web Services (AWS), including AWS Batch as a compute executor, S3 as a file system, and Fusion file system for high-performance data operations.

## Get Started

To use this plugin, add it to your `nextflow.config`:

```groovy
plugins {
    id 'nf-amazon'
}
```

Configure your AWS credentials using environment variables, AWS CLI profiles, or IAM roles. Then set up the executor and work directory:

```groovy
process.executor = 'awsbatch'
process.queue = '<YOUR BATCH QUEUE>'
workDir = 's3://<YOUR BUCKET>/work'

aws {
    region = 'us-east-1'
    batch {
        cliPath = '/home/ec2-user/miniconda/bin/aws'
    }
}
```

## Examples

### Basic AWS Batch Configuration

```groovy
plugins {
    id 'nf-amazon'
}

process.executor = 'awsbatch'
process.queue = 'my-batch-queue'
workDir = 's3://my-bucket/work'

aws {
    region = 'eu-west-1'
    batch {
        cliPath = '/home/ec2-user/miniconda/bin/aws'
        jobRole = 'arn:aws:iam::123456789:role/MyBatchJobRole'
    }
}
```

### Using Fusion File System

```groovy
fusion {
    enabled = true
}

wave {
    enabled = true
}

process.executor = 'awsbatch'
workDir = 's3://my-bucket/work'
```

### S3 Storage Options

```groovy
aws {
    client {
        maxConnections = 20
        connectionTimeout = 10000
        storageEncryption = 'AES256'
    }
    region = 'us-east-1'
}
```

## Resources

- [AWS Batch Executor Documentation](https://nextflow.io/docs/latest/aws.html)
- [Amazon S3 Storage Documentation](https://nextflow.io/docs/latest/aws.html#s3-storage)

## License

[Apache License 2.0](https://www.apache.org/licenses/LICENSE-2.0)
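The Batch examples above set one queue and region for the whole run; resources are typically tuned per process with Nextflow's standard process selectors. A minimal sketch; the process name, queue, and resource values below are illustrative:

```groovy
// Per-process resources layered on top of the AWS Batch executor settings.
// Process name, queue and values are illustrative.
process {
    executor = 'awsbatch'
    queue = 'my-batch-queue'

    // Give only the (hypothetical) ALIGN process larger resources
    withName: 'ALIGN' {
        cpus = 8
        memory = '16 GB'
        time = '4h'
    }
}
```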

plugins/nf-amazon/build.gradle
Lines changed: 1 addition & 0 deletions

```diff
@@ -25,6 +25,7 @@ nextflowPlugin {
     description = 'Provides comprehensive AWS cloud integration including Batch executor, S3 file system, and Fusion support for high-performance data operations'
     className = 'nextflow.cloud.aws.AmazonPlugin'
     useDefaultDependencies = false
+    generateSpec = false
     extensionPoints = [
         'nextflow.cloud.aws.batch.AwsBatchExecutor',
         'nextflow.cloud.aws.config.AwsConfig',
```

plugins/nf-azure/README.md
Lines changed: 63 additions & 50 deletions

````diff
@@ -1,70 +1,83 @@
-# Azure plugin for Nextflow
+# Nextflow plugin for Microsoft Azure
 
-This plugin implements the support for Azure Blob storage as file system
-provider (via JSR203 interface) and Azure Batch executor for Nextflow.
+This plugin provides support for Azure Blob Storage as a file system and Azure Batch as a compute executor for Nextflow pipelines.
 
-## Development
+## Get Started
 
-Build Nextflow as usual:
+To use this plugin, add it to your `nextflow.config`:
 
-```bash
-make compile
+```groovy
+plugins {
+    id 'nf-azure'
+}
 ```
 
-Use the following Nextflow configuration:
+Configure your Azure credentials and services:
+
+```groovy
+azure {
+    storage {
+        accountName = '<YOUR STORAGE ACCOUNT NAME>'
+        accountKey = '<YOUR STORAGE ACCOUNT KEY>'
+    }
+
+    batch {
+        endpoint = 'https://<YOUR BATCH ACCOUNT NAME>.<REGION>.batch.azure.com'
+        accountName = '<YOUR BATCH ACCOUNT NAME>'
+        accountKey = '<YOUR BATCH ACCOUNT KEY>'
+    }
+}
+```
+
+Set the executor and work directory:
+
+```groovy
+process.executor = 'azurebatch'
+workDir = 'az://<YOUR CONTAINER>/work'
+```
+
+## Examples
+
+### Basic Azure Batch Configuration
 
 ```groovy
 plugins {
-  id 'nf-azure'
+    id 'nf-azure'
 }
 
 azure {
-  storage {
-    accountKey = "<YOUR STORAGE ACCOUNT KEY>"
-    accountName = "<YOUR STORAGE ACCOUNT KEY>"
-  }
-
-  batch {
-    endpoint = 'https://<YOUR BATCH ACCOUNT NAME>.westeurope.batch.azure.com'
-    accountName = '<YOUR BATCH ACCOUNT NAME>'
-    accountKey = '<YOUR BATCH ACCOUNT KEY>'
-  }
+    storage {
+        accountName = 'mystorageaccount'
+        accountKey = System.getenv('AZURE_STORAGE_KEY')
+    }
+
+    batch {
+        endpoint = 'https://mybatchaccount.westeurope.batch.azure.com'
+        accountName = 'mybatchaccount'
+        accountKey = System.getenv('AZURE_BATCH_KEY')
+        autoPoolMode = true
+        deletePoolsOnCompletion = true
+    }
 }
 
 process.executor = 'azurebatch'
-workDir = 'az://<YOUR DATA CONTAINER>/work'
+workDir = 'az://mycontainer/work'
 ```
 
-Then test the local build as usual:
+### Using Managed Identity
 
-```bash
-./launch.sh run -c nextflow.config rnaseq-nf
+```groovy
+azure {
+    managedIdentity {
+        clientId = '<YOUR MANAGED IDENTITY CLIENT ID>'
+    }
+}
 ```
 
-## Todo
-
-* Currently, the Blob storage service uses NettyHttpClient and Batch service
-uses OkHttp client, duplicating the number of required libraries. In principle
-the Blob service can use OkHttp, adding the following deps, however using that
-Nextflow hangs during the shutdown, apparently because the connection pool used
-by the blob service is not closed timely.
-
-```groovy
-compile('com.azure:azure-storage-blob:12.9.0') {
-    exclude group: 'org.slf4j', module: 'slf4j-api'
-    exclude group: 'com.azure', module: 'azure-core-http-netty'
-}
-compile('com.azure:azure-core-http-okhttp:1.3.3') {
-    exclude group: 'org.slf4j', module: 'slf4j-api'
-}
-```
-
-* Remove invalid directory from .command.run PATH for project having `bin/` folder
-* Add the configuration for the region
-* Make the backend endpoint optional
-
-### Additional Resources
-
-* https://github.com/Azure/azure-sdk-for-java/wiki
-* https://github.com/Azure/azure-sdk-for-java/tree/master/sdk/storage/azure-storage-blob-nio
-* https://github.com/Azure/azure-sdk-for-java/blob/master/sdk/storage/azure-storage-blob-nio/src/samples/java/com/azure/storage/blob/nio/ReadmeSamples.java
+## Resources
+
+- [Azure Batch Executor Documentation](https://nextflow.io/docs/latest/azure.html)
+
+## License
+
+[Apache License 2.0](https://www.apache.org/licenses/LICENSE-2.0)
````
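Beyond `autoPoolMode`, Azure Batch work can also be directed to a named, pre-configured pool via the process `queue`. A minimal sketch; the pool name, VM size, and node count below are illustrative:

```groovy
// Define a named compute pool and point processes at it.
// Pool name, VM size and node count are illustrative.
azure {
    batch {
        endpoint = 'https://mybatchaccount.westeurope.batch.azure.com'
        accountName = 'mybatchaccount'
        accountKey = System.getenv('AZURE_BATCH_KEY')
        pools {
            'my-pool' {
                vmType = 'Standard_D4_v3'
                vmCount = 4
            }
        }
    }
}

process.executor = 'azurebatch'
// The queue directive selects the pool by name
process.queue = 'my-pool'
```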

plugins/nf-azure/build.gradle
Lines changed: 1 addition & 0 deletions

```diff
@@ -25,6 +25,7 @@ nextflowPlugin {
     description = 'Enables Azure cloud execution through Batch service with native Blob storage access and comprehensive authentication options'
     className = 'nextflow.cloud.azure.AzurePlugin'
     useDefaultDependencies = false
+    generateSpec = false
     extensionPoints = [
         'nextflow.cloud.azure.batch.AzBatchExecutor',
         'nextflow.cloud.azure.config.AzConfig',
```

plugins/nf-cloudcache/README.md
Lines changed: 84 additions & 0 deletions (new file)

# Nextflow plugin for cloud cache

This plugin provides cloud-based caching support for Nextflow pipelines, enabling workflow resume capability when using cloud storage as the work directory.

## Get Started

To use this plugin, add it to your `nextflow.config`:

```groovy
plugins {
    id 'nf-cloudcache'
}
```

The plugin is automatically activated when using cloud storage (S3, GS, Azure Blob) as the work directory with resume enabled.

```groovy
workDir = 's3://my-bucket/work'
```

Run your pipeline with the `-resume` flag:

```bash
nextflow run main.nf -resume
```

## Examples

### AWS S3 Cache

```groovy
plugins {
    id 'nf-amazon'
    id 'nf-cloudcache'
}

workDir = 's3://my-bucket/work'

aws {
    region = 'us-east-1'
}
```

### Google Cloud Storage Cache

```groovy
plugins {
    id 'nf-google'
    id 'nf-cloudcache'
}

workDir = 'gs://my-bucket/work'

google {
    project = 'my-project'
    location = 'us-central1'
}
```

### Azure Blob Storage Cache

```groovy
plugins {
    id 'nf-azure'
    id 'nf-cloudcache'
}

workDir = 'az://my-container/work'

azure {
    storage {
        accountName = 'mystorageaccount'
        accountKey = System.getenv('AZURE_STORAGE_KEY')
    }
}
```

## Resources

- [Nextflow Cache and Resume](https://nextflow.io/docs/latest/cache-and-resume.html)

## License

[Apache License 2.0](https://www.apache.org/licenses/LICENSE-2.0)
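When the cache should live somewhere other than the work directory, recent Nextflow releases expose a `cloudcache` config scope for setting the location explicitly. A minimal sketch, assuming a recent Nextflow version; the bucket paths are illustrative:

```groovy
// Store the task cache at an explicit cloud location instead of workDir.
// Bucket paths are illustrative; the cloudcache scope may not exist in
// older Nextflow releases.
plugins {
    id 'nf-cloudcache'
}

workDir = 's3://my-bucket/work'

cloudcache {
    enabled = true
    path = 's3://my-bucket/cache'
}
```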

plugins/nf-cloudcache/build.gradle
Lines changed: 1 addition & 0 deletions

```diff
@@ -25,6 +25,7 @@ nextflowPlugin {
     description = 'Implements cloud-based caching system to optimize workflow performance by reducing redundant computations and data transfers'
     className = 'nextflow.CloudCachePlugin'
     useDefaultDependencies = false
+    generateSpec = false
     extensionPoints = [
         'nextflow.cache.CloudCacheFactory'
     ]
```
