Multiple File/Folder Inject

apietrocini
Member
Posts: 32
Joined: Fri Mar 24, 2017 7:06 pm
Location: Cleveland, OH

Multiple File/Folder Inject

Post by apietrocini »

Hello All,

I'm sure this has been addressed before, but I can't seem to find a thread discussing a solution. What I'm trying to do is create a job archive process through Switch. I can execute a DB query to return all jobs that have been closed that day. The problem I'm running into is when the result has multiple values. Has anyone been able to inject multiple files or folders into a flow based on a dataset? Example: the DB query returns job number values 123, 456, 789, and 999. I want to go to our job server, remove job folders 123, 456, etc., inject them into Switch, compress them, and move them into a different directory. Any help would be appreciated.

Anthony Pietrocini
gabrielp
Advanced member
Posts: 645
Joined: Fri Aug 08, 2014 4:31 pm
Location: Boston

Re: Multiple File/Folder Inject

Post by gabrielp »

I think there are a lot of ways to skin this cat -- looking forward to others' responses.

Based on your description, I guess I would write a little script that creates a job named for each job number in the array. So if my dataset value was "123, 124, 125, 126", I would send the triggering file into the script and output:
- 123.txt
- 124.txt
- 125.txt
- 126.txt

Then pass each of those slips through the inject.
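
For the splitter itself, here's a rough sketch of what I have in mind as a legacy Switch script element. I'm writing the calls (the File class, createNewJob, createPathWithName, sendToSingle, sendToNull) from memory, so verify everything against the scripting reference before trusting it:

    // Called by Switch for each incoming job. Assumes the triggering
    // file's body is the comma-separated dataset value,
    // e.g. "123, 124, 125, 126".
    function jobArrived( s : Switch, job : Job )
    {
        // Read the dataset out of the triggering file
        var inFile = new File( job.getPath() );
        inFile.open( File.ReadOnly );
        var list = inFile.read();
        inFile.close();

        var numbers = list.split( "," );
        for ( var i = 0; i < numbers.length; i++ )
        {
            var num = numbers[i].replace( /^\s+|\s+$/g, "" ); // trim
            if ( num == "" )
                continue;

            // One fresh job per job number; each slip drives one inject
            var newJob = s.createNewJob();
            var outPath = newJob.createPathWithName( num + ".txt" );
            var outFile = new File( outPath );
            outFile.open( File.WriteOnly );
            outFile.write( num );
            outFile.close();
            newJob.sendToSingle( outPath );
        }

        // Discard the original triggering job
        job.sendToNull( job.getPath() );
    }

Since each slip is named after its job number, the inject downstream can build the folder path on your job server from something like [Job.NameProper].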

Here are some scripts that create dummy jobs, for reference:
- https://github.com/open-automation/switch-job-repeater
- https://github.com/open-automation/switch-dummy-job-clock

But if you are closing the loop on the archiving (e.g. writing back to the database that a job has been archived), then you could look into a system that does one at a time instead of all at once. Perhaps modify your query to select "the first job that needs to be archived" instead of "all jobs that need to be archived". In this case it would return "123", which you can feed into the inject and then write back to the database. The next time the flow is called, it would pull out "124", etc. You could use something like dummy job clock to run this at a constant interval. This has the advantage of controlling the rate of archiving so you don't overload your Switch server all at once. You can set a rate limit of 4 jobs per hour or something.
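
The query change is the whole trick there. With made-up table and column names (adjust to your schema), I'm picturing something like:

    -- Hypothetical schema; swap in your real table/column names.
    -- Pull exactly one closed, not-yet-archived job per cycle.
    SELECT job_number
    FROM jobs
    WHERE status = 'closed'
      AND archived = 0
    ORDER BY closed_date
    LIMIT 1;

    -- After the folder is compressed and moved, close the loop:
    UPDATE jobs
    SET archived = 1
    WHERE job_number = 123;

The LIMIT 1 (or SELECT TOP 1, depending on your database) is what turns "all jobs" into "the first job", and the archived flag is what keeps the same job from coming back on the next tick.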

You could even combine both approaches, which would let you avoid writing a custom script to parse your array: first query the total number of jobs to be archived (4) and feed that into switch-job-repeater, which creates 4 files. Then, one by one (with a hold job or something), feed those files into "the first job that needs to be archived", archiving them one at a time.
Free Switch scripts: open-automation @ GitHub
Free Switch apps: open-automation @ Enfocus appstore

Want to hire me? I'm looking for my next gig. Contact me on LinkedIn or via email.