- # Prerequisites:
+ = Prerequisites:

- ## Run Locally:
+ == Run Locally:

A MySQL 5.7 server running with the database `test` initialized.
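
For example, one way to satisfy this prerequisite with Docker (a minimal sketch; the container name and root password are placeholders):

[source,bash]
----
# Throwaway MySQL 5.7 instance; the official image creates
# the `test` database at startup via MYSQL_DATABASE
docker run -d --name mysql57 \
  -e MYSQL_ROOT_PASSWORD=secret \
  -e MYSQL_DATABASE=test \
  -p 3306:3306 mysql:5.7
----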

- ## Run in Kubernetes
+ == Run in Kubernetes

Use Dataflow deployed in K8s:

- `` `bash
- dataflow>app register -- name batch-remote-partition -- type task -- uri "docker://springcloud/batch-remote-partition:0.0.1-SNAPSHOT
+ [source,bash]
+ ----
+ dataflow>app register --name batch-remote-partition --type task --uri "docker://springcloud/batch-remote-partition:0.0.1-SNAPSHOT"
dataflow>task create batch-remote-partition --definition batch-remote-partition
dataflow>task launch batch-remote-partition --properties "deployer.*.kubernetes.deploymentServiceAccountName=scdf-data-flow" --arguments "--platform=kubernetes --artifact=docker://springcloud/batch-remote-partition"
- ```
+ ----
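
One way to verify the launch (a sketch, assuming `kubectl` targets the same cluster; pod names and counts depend on the partition configuration):

[source,bash]
----
# The launched task and its worker partitions run as pods
kubectl get pods | grep batch-remote-partition
----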

- ### Build image to minikube registry
- ```
+ === Build image to minikube registry
+
+ [source,bash]
+ ----
eval $(minikube docker-env)
./mvnw clean package jib:dockerBuild
- ```
+ ----
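
To confirm the image is visible to minikube's Docker daemon (assuming the `docker-env` settings above are still active in this shell):

[source,bash]
----
docker images | grep batch-remote-partition
----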

- ### Publish updated docker image to `springcloud`:
+ === Publish updated docker image to `springcloud`:

- ```bash
+ [source,bash]
+ ----
$./mvnw clean package jib:build -Djib.to.auth.username= -Djib.to.auth.password=
- ```
+ ----
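
Alternatively, Jib can pick up registry credentials from the local Docker config, so a prior `docker login` avoids passing them on the command line (a sketch; `<dockerhub-user>` is a placeholder):

[source,bash]
----
docker login -u <dockerhub-user>
./mvnw clean package jib:build
----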

- ## Run in Cloudfoundry
+ == Run in Cloudfoundry

- ### Publish updated jar to `repo.spring.io`
+ === Publish updated jar to `repo.spring.io`

Set the credentials in ~/.m2/settings.xml

- ```xml
+ [source,xml]
+ ----
<settings>
  <servers>
    <server>
@@ -42,22 +47,22 @@ Set the credentials in ~/.m2/settings.xml
    </server>
  </servers>
</settings>
- ```
+ ----

- ```bash
+ [source,bash]
+ ----
$./mvnw clean deploy
- ```
+ ----

Use Dataflow deployed in Cloudfoundry:

In this case we need to provide the internal CloudFoundry Deployer instance with the same CF environment configuration that SCDF uses.
Setting these properties on the task definition will end up as environment variables in the app container.

- ```bash
- dataflow:>app register --name batch-remote-partition --type task --uri maven://org.springframework.cloud.dataflow.acceptence.tests:batch-remote-partition:0.0.1-SNAPSHOT
+ [source,bash]
+ ----
+ dataflow:>app register --name batch-remote-partition --type task --uri "maven://org.springframework.cloud.dataflow.acceptence.tests:batch-remote-partition:0.0.1-SNAPSHOT"
dataflow:>task create batch-remote-partition --definition "batch-remote-partition --spring.cloud.deployer.cloudfoundry.password=***** --spring.cloud.deployer.cloudfoundry.username=<username> --spring.cloud.deployer.cloudfoundry.org=<org> --spring.cloud.deployer.cloudfoundry.space=<space> --spring.cloud.deployer.cloudfoundry.url=<url> --spring.cloud.deployer.cloudfoundry.skipSslValidation=true"
# Default artifact will work here.
dataflow:>task launch batch-remote-partition --arguments "--platform=cloudfoundry"
- ```
-
-
+ ----
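
To check the result of a launch from the shell (a sketch; the exact output columns depend on the Data Flow version):

[source,bash]
----
dataflow:>task execution list
----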