Just wanted to post my solution for processing one file at a time in a Switch flow, since none of the methods I've seen thus far have worked for my use case. The flow is attached as a zip file.
In this flow, jobs are stored in a waiting folder as an indication that the flow is busy. When the flow's activities are complete, the job is deleted from that folder, allowing the next job to be submitted.
- The outgoing connection of the Hold job element is set to "Space jobs apart" with an interval of 10 seconds. This ensures that no more than one job is ever released at a time.
- A simple PowerShell script (code below) counts the number of PDF files in the folder passed to it on the command line. (If you're not a PowerShell fan, this could be done in JavaScript using the "Script Private Data" element from the Appstore, or with a simple find on a Mac.)
- If the script returns a nonzero count, the job is sent back to the hold folder. Since all the folders up to this point are Auto-managed, these operations have very low overhead.
- If the script returns zero, the job is stored locally using a Store it element and then passes through to the rest of the flow.
- The flow can go about its business handling that job. (Simulated in this example by a "Hold job" element)
- Once the flow is finished, the job is sent to an Inject Wildcard element that deletes the job of the same name from the local store; the job itself is then discarded.
Code:
param(
    [Parameter(Mandatory = $true)]
    [string]$folderPath
)

# Count the PDF files directly inside the given folder and print the count,
# so the calling flow can read it from the script's output. Join-Path keeps
# paths containing spaces intact.
Write-Host (Get-ChildItem -Path (Join-Path $folderPath '*.pdf') | Measure-Object).Count
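For the "simple find on a Mac" alternative mentioned above, a minimal sketch could look like the following. The convention of taking the folder path as the first argument mirrors the PowerShell script and is an assumption; adapt it to however your flow invokes the script.

```shell
#!/bin/sh
# Count the PDF files directly inside the given folder and print the count.
# Falling back to the current directory when no argument is given is just a
# convenience for testing; in the flow, pass the store folder's path.
folderPath="${1:-.}"
find "$folderPath" -maxdepth 1 -type f -name '*.pdf' | wc -l
```

Note that -maxdepth keeps the count limited to the folder itself, matching the non-recursive behavior of the PowerShell version.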