Simply use the call func and put the callee task names into the do list to chain them together.
The example below shows that the private vars scope is shared and common to all callee tasks.
In this case, the combination of call and a well-designed callee task implementation creates a manageable scope: first extend the global runtime vars, then pass the needed vars through to the callee tasks as injected arguments.
tasks:
  -
    name: Main
    task:
      -
        func: call
        vars:
          a: local-var-a
          b: local-var-b
        do:
          - task1
          - task2
So task1 can obtain three types of inputs: the implicit global runtime vars, its own local private vars, and the vars injected by the caller.
That is, each task in the do list can make use of the implicit global vars and its local private vars, while its behavior can also be controlled by the injected caller vars.
To make your code automatically context aware, always put the vars it needs into scope.
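For illustration, here is a minimal sketch of the callee side showing all three input types. The func: shell step and the {{.varname}} template references are assumptions for demonstration only; check the func and templating references for the exact names.

vars:
  g: global-var-g                  # implicit global runtime var
tasks:
  -
    name: Main
    task:
      -
        func: call
        vars:
          a: injected-by-caller    # caller var injected into task1
        do:
          - task1
  -
    name: task1
    task:
      -
        func: shell                # assumed func name for running shell commands
        vars:
          c: local-to-task1        # task1's own local private var
        do:
          - echo "global={{.g}} injected={{.a}} local={{.c}}"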
Putting the vars into different layers is flexible. The basic guideline is to start with a vars declaration in the global runtime, e.g.
vars:
  a: aa
tasks:
  -
    name: Main
    task:
      -
        func: call
        do:
          - task1
The benefit of doing this is that you can later lift the vars up to a scope profile group, then to the global group; or, if a var is more specific to one task, define it directly in the func step instead.
Alternatively, you could start with all vars local to the func step, then lift the ones that turn out to be common to all tasks and worth sharing up to the global runtime scope. This practice is normally used for ad-hoc tests, quick mockups, and so on.
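As a minimal sketch of the local-first starting point (the var names are purely illustrative):

tasks:
  -
    name: Main
    task:
      -
        func: call
        vars:                      # start with everything local to this func step
          a: aa
          b: bb
        do:
          - task1

Later, if a proves to be common to all tasks, move it into the top-level vars declaration of the global runtime, as in the previous example.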
There is no task-level vars declaration tag (like vars in a func step or in a scope group), but you can use a dvar flag to mark a var as task scoped, so that it is shared across the entire task. The reason for not introducing a task vars tag is to reduce complexity and increase composability.
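A rough sketch of how that might look follows; the dvars block shape and the flag name taskScope are assumptions here, so check the dvar flags reference for the precise spelling.

tasks:
  -
    name: Main
    task:
      -
        func: call
        dvars:                     # assumed: dvars declared in a func step
          - name: shared_var
            value: visible-to-every-step-in-this-task
            flags:
              - taskScope          # assumed flag name for task-level scope
        do:
          - task1
      -
        func: call                 # a later step in the same task
        do:
          - task2                  # shared_var should still be in scope here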
The call func is a really powerful tool. It uses IoC (inversion of control) to inject the needed vars into the callee, while overriding and merging the global runtime vars.
global vars --[override and merge]--> call func vars --[override]--> callee func (vars injected from call)
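A small sketch of that precedence, assuming the merge order works exactly as described above (the comment on the echo line shows the expected result under that assumption, not verified output); func: shell and the template syntax are the same assumptions as in the earlier sketch.

vars:
  a: from-global
  b: from-global
tasks:
  -
    name: Main
    task:
      -
        func: call
        vars:
          a: from-caller           # overrides the global a for the callee
        do:
          - task1
  -
    name: task1
    task:
      -
        func: shell                # assumed func name
        do:
          - echo "a={{.a}} b={{.b}}"    # expected: a=from-caller b=from-global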