Dynamically define number of jobs to Assemble

rowen
Newbie
Posts: 11
Joined: Fri Oct 12, 2018 10:16 am

Dynamically define number of jobs to Assemble

Post by rowen »

Hello,

Here at the company we're still new to Switch, we've only had it for a few months. So there's a lot still to learn and figure out as we get more experience.
At the moment we are trying to automate a particular task and are having a hard time figuring out how to solve it.

We have the licenses for the Switch Core Engine, and the Configurator, Metadata and Scripting modules.

I'll try to describe the process the best I can and also what we've tried so far.
If there's any other information about the process, the flows, or the configurators that I can provide that would be useful, just let me know.

Every day we do the following procedure:
  • We receive an email with the name of the .pdf files that will be arriving that day, among other characteristics.

    The email isn’t necessarily the first thing to arrive. Sometimes it is, other times we have already received some .pdf files when the email arrives.
  • We receive some number N of .pdf files

    They don’t arrive together, they arrive individually throughout the afternoon, to a shared folder. We don’t know the time they start or end arriving.
  • We grab all the .pdf files received that day and merge them into a single .pdf.
    
The order of the merge isn’t relevant, all we need is to make it a single .pdf.
  • We process the final .pdf.

What we want to achieve is an automated process which receives the email data and the .pdf files, and merges the .pdf files into one, by either comparing the number of files expected for the day, or comparing the names of the files to the ones in the email, which we can then process as we like.


With Switch we have managed to automate some steps of the process:
  • Receive the email as .txt and use a script to process its information to determine the total number of expected .pdf files. It should be easy to get the names of the files instead, if we had to. (view attachment: mail-receive-process.png)
  • Merge the .pdf files into a single one with the Merge configurator, by (manually) passing a Folder with the .pdf files inside.
    or
  • Having a fixed number of files, use the Assemble Job configurator to assemble the .pdf files into a Job Folder, and then use the Merge configurator to join them into one. (view attachment: assemble-merge.png)
What we’re missing is the bridge between the two parts, that is, dynamically defining the number (or the name) of files that should be assembled into a Job Folder.
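To make the goal concrete, the decision we need the flow to make could be sketched in plain JavaScript (this is only an illustration, not the Switch scripting API; `shouldAssemble` and its parameters are invented names):

```javascript
// Decide whether all of today's PDFs have arrived, either by
// comparing counts or by comparing names against the email's list.
function shouldAssemble(expectedNames, arrivedNames) {
  // Count-based check: not enough files yet.
  if (arrivedNames.length < expectedNames.length) {
    return false;
  }
  // Name-based check: every expected file is among the arrivals.
  return expectedNames.every(function (name) {
    return arrivedNames.indexOf(name) !== -1;
  });
}
```

For example, `shouldAssemble(["a.pdf", "b.pdf"], ["b.pdf", "a.pdf"])` is true regardless of arrival order, which matches our "order of the merge isn't relevant" requirement.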


We have tried the following:
  • On the Script, after processing the file, assigning the total number of expected files to a GlobalData variable.

    Result: We can’t access the variable from the Assemble Job configurator
  • On the Script, after processing the file, assigning the total number of expected files to a PrivateData key, and send this .txt file as a job to the Assemble Job configurator, solely so it can read its PrivateData.

    Result: The configurator receives the variable when the job arrives, but as soon as another job (one of the .pdf files for merging) arrives, the variable no longer exists on this new job, and the .pdf is sent to the Problem Jobs folder.
  • Use a Switch.Counter, which is updated in the Script, to save the total number of expected files.

    Result: The configurator waits for the number of files to be reached, but for some reason the Counter seems to keep increasing and the flow won't proceed. Also, we can't reset the Counter.
  • Make a loop which receives the .pdf files and does Assemble Job and Disassemble Job until the total number of files in the assembled job equals the number in the Counter. (view attachment: assemble-loop.png)
    Result: The Counter seems unreliable, as it keeps incrementing for no apparent reason. We tried using brand-new counters and running the script only once, activating the flow once without deactivating it, and even dropping in exactly as many files as the counter expected all at once, and it still failed.
In any case, this option doesn't seem practical or efficient, as it processes the files several times without real need.
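The failure mode of the PrivateData attempt above can be modelled in a few lines of plain JavaScript (a toy model, not the Switch API; each job carries its own private-data map, so a value attached to one job is invisible to the next):

```javascript
// Toy model: private data travels with one job only, so the count
// written on the mail .txt job never reaches the later .pdf jobs.
function makeJob(name) {
  return { name: name, privateData: {} };
}

var mailJob = makeJob("mail.txt");
mailJob.privateData["totalFiles"] = 5; // set by the mail-processing script

var pdfJob = makeJob("file1.pdf"); // arrives later, as a separate job

mailJob.privateData["totalFiles"]; // 5
pdfJob.privateData["totalFiles"];  // undefined: the .pdf job knows nothing
```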
We haven’t tried:
  • Using the Hold Job configurator, with a release condition

    We suppose this won't solve our issue if we keep using the Counter as the comparison element.

Thank you,
Owen
Attachments
mail-receive-process.png
assemble-merge.png
assemble-loop.png
Padawan
Advanced member
Posts: 363
Joined: Mon Jun 12, 2017 8:48 pm
Location: Belgium
Contact:

Re: Dynamically define number of jobs to Assemble

Post by Padawan »

rowen wrote: Fri Oct 12, 2018 12:53 pm
  • On the Script, after processing the file, assigning the total number of expected files to a GlobalData variable.

    Result: We can’t access the variable from the Assemble Job configurator
I expect you should be able to access the globaldata from the Assemble Job using a script expression. Have you tried this?

Also, you might want to keep in mind the situation that on Monday you have 1 incoming file, which sets the global data to "1". On Tuesday you have 10 incoming files, but the first incoming file arrives before the Tuesday mail arrives. At that point the global data is still "1", so your flow might already assemble at this point.

I would personally fix this by setting the global data to "9999" at the end of the day, so it is still set to that value the next morning. That way the incoming files alone will never trigger the assembly (assuming you never have 9999 incoming PDF files). Then when your mail arrives it can overwrite "9999" with the correct number of PDFs for that day.
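That sentinel scheme could be simulated like this (a sketch in plain JavaScript, not the Switch API; `onMailArrived`, `onPdfArrived` and `endOfDay` are invented names standing in for the scripts and configurators in the flow):

```javascript
// Toy simulation of the "reset to 9999 overnight" scheme.
var SENTINEL = 9999;
var expected = SENTINEL;   // stands in for the global data value
var arrived = 0;

function onMailArrived(totalFiles) { expected = totalFiles; }
function onPdfArrived() { arrived += 1; return arrived >= expected; } // true => assemble
function endOfDay() { expected = SENTINEL; arrived = 0; }

// Tuesday: one PDF sneaks in before the mail...
onPdfArrived();               // false: 1 < 9999, nothing assembles early
onMailArrived(2);             // mail says two files today
var release = onPdfArrived(); // true: 2 >= 2, assemble now
endOfDay();                   // back to the sentinel for tomorrow
```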
rowen

Re: Dynamically define number of jobs to Assemble

Post by rowen »

Padawan wrote: Fri Oct 12, 2018 2:19 pm I expect you should be able to access the globaldata from the Assemble Job using a script expression. Have you tried this?
I had tried it before, but didn't manage to get it working.

After your response I tried it again, but added a line to unlock the GlobalData first, which fixed it. Like so:

Code: Select all

s.unlockGlobalData();
s.getGlobalData("incomingFiles", "totalFiles");
Do I always have to unlock GlobalData before using it in a script expression?

In any case, it worked! Thank you very much!
Padawan wrote: Fri Oct 12, 2018 2:19 pm Also, you might want to keep in mind the situation that on Monday you have 1 incoming file, which sets the global data to "1". On Tuesday you have 10 incoming files, but the first incoming file arrives before the Tuesday mail arrives. At that point the global data is still "1", so your flow might already assemble at this point.
Yes, I figured that might happen. Thanks for the suggestion.
I'm using a Hold Job configurator that receives the mail.txt that is processed at the beginning and holds it till the end of the day, when it releases the job to a new script that resets the GlobalData variable.

Do you think that's a good solution, or is there a better way to run a script?
I don't really have any need for the mail.txt, after the data is processed…

Thank you
Padawan

Re: Dynamically define number of jobs to Assemble

Post by Padawan »

I didn't expect it to be necessary to unlock the global data for read access.

Did you remove the lock after you wrote the global data?

If it turns out that the behaviour is not as you expect, then you might report it to Enfocus. They will either explain it, fix it, or clarify the documentation :)
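A toy model of the lock behaviour that could explain what you saw (plain JavaScript, not the Switch API; the lock semantics here are an assumption, with a writer that never releases its lock):

```javascript
// Hypothetical model: if a writer locks global data and never
// unlocks it, readers are blocked until someone calls unlock().
function GlobalStore() {
  this.data = {};
  this.locked = false;
}
GlobalStore.prototype.lockAndSet = function (key, value) {
  this.locked = true;       // writer takes the lock...
  this.data[key] = value;   // ...writes, but forgets to release it
};
GlobalStore.prototype.unlock = function () { this.locked = false; };
GlobalStore.prototype.get = function (key) {
  if (this.locked) throw new Error("global data is locked");
  return this.data[key];
};

var store = new GlobalStore();
store.lockAndSet("totalFiles", 5);
// store.get("totalFiles") would throw here; unlocking first fixes it:
store.unlock();
store.get("totalFiles"); // 5
```

If this is what is happening, releasing the lock in the writing script would remove the need for the `unlockGlobalData()` call in the reading script expression.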