Web Service API - good practices

tristan.juhe
Newbie
Posts: 3
Joined: Mon Jan 25, 2021 9:55 pm

Web Service API - good practices

Post by tristan.juhe »

Hi all,

We are using the Web Service API in Switch (2020 Fall) to create jobs and track their progress, and we are seeing some behaviour we don't understand that you can maybe clarify for us.

When we create a new job, we immediately call the "jobs list" API method to retrieve the "job id" so we can track it later. This works fine.
After some time, we call the "jobs list" method again to follow the job's progress, and we see that the originally created job has status "completed" while a new job (with a different ID but the same name and the same "family") is processing. It seems the original job has been replaced by this new one...
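In code, the tracking we attempt looks roughly like this sketch (the base URL, port, endpoint path, auth scheme and field names are assumptions for illustration, not the documented Web Services routes):

// Minimal polling sketch. BASE_URL, the endpoint path and the
// response field names are placeholders, not the exact Switch
// Web Services API; adapt them to the real routes.
const BASE_URL = "http://switch-server:51088";

interface SwitchJob {
  id: string;
  name: string;
  family: string;
  status: string;
}

async function listJobs(token: string): Promise<SwitchJob[]> {
  const res = await fetch(`${BASE_URL}/api/v1/jobs`, {
    headers: { Authorization: `Bearer ${token}` },
  });
  if (!res.ok) throw new Error(`jobs list failed: ${res.status}`);
  return (await res.json()) as SwitchJob[];
}

// Tracking by name is what we do today, and it is fragile: after a
// fork the original id shows "completed" and a new id appears with
// the same name and family.
async function trackJob(token: string, name: string): Promise<SwitchJob[]> {
  return (await listJobs(token)).filter((j) => j.name === name);
}
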
To confirm this behaviour we checked the Switch UI, and as you can see in the attached screenshots, the Switch UI shows 2 different jobs with 2 different prefixes: 00IIA and 00IIC.
In the API we can also see 2 jobs, but both with the same "family" 00IIA. The other properties have the same values.

Can you please explain the difference between "prefix" in the UI and "family" in the API?

As I explained earlier, we are trying to track a job from the start to the end of our workflow, but that seems difficult/impossible if the first created job we catch is not the one we have to follow afterwards.
It seems that if we use a linear workflow without any fork, the original job is kept until the end, but that means very limited possibilities (and no recycle bin).

Last question: is there an elegant way to put a job in error status, as we cannot naturally link to the "Problem jobs" element?

Thank you very much for all your help.

BR.
Tristan.

freddyp
Advanced member
Posts: 1008
Joined: Thu Feb 09, 2012 3:53 pm

Re: Web Service API - good practices

Post by freddyp »

Every new individual job gets a unique prefix. When a file gets processed into a new file (preflight result, conversion, ...), that new file gets its own prefix, but at the same time it inherits everything from the original job. If you put private data and attach a dataset to a PDF file and then you convert the PDF to an image, that image has the same private data and the same dataset. The relationship does not have to be one-to-one. If you have a job folder with 20 files in it and you ungroup or dismantle the job folder, you get 20 new jobs that inherit from the 1 job folder, and all these 20 jobs belong to the same job family. It is a concept that is only present and accessible in the scripting API.
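As a rough sketch, reading that inherited data from a script element looks like this in the Node.js scripting API (the Switch, FlowElement, Job and LogLevel types are ambient, supplied by the Switch scripting environment; double-check the method names against the scripting reference for your version, and "OrderId" is just an example tag):

// Script element entry point (Switch 2020+ Node.js scripting API).
async function jobArrived(s: Switch, flowElement: FlowElement, job: Job) {
  // Private data set on the original job travels with every child
  // job in the same family, e.g. the image rendered from the PDF.
  const orderId = await job.getPrivateData("OrderId");
  await job.log(LogLevel.Info, "Job %1 belongs to order %2", [
    await job.getName(),
    orderId,
  ]);
  await job.sendToSingle(); // pass the job along unchanged
}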

Based on what you describe you want to do, I advise you to use GraphQL: https://www.enfocus.com/manuals/Develop ... ardGraphQL. It gives you a lot more control over the query, and in the query you can include the "flow stage". If you have the Reporting Module, the queries also work on historical data and not just on live data.
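A sketch of such a query, posted from TypeScript (the port, the root field and the field names, including "flowStage", are assumptions; the real schema is in the dashboard GraphQL reference linked above):

// GraphQL sketch; endpoint and field names (jobs, name, family,
// flowStage) are assumptions to be checked against the schema.
const query = `
  {
    jobs {
      name
      family
      flowStage
    }
  }
`;

async function queryJobs(token: string) {
  const res = await fetch("http://switch-server:51089/graphql", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${token}`,
    },
    body: JSON.stringify({ query }),
  });
  return (await res.json()).data;
}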

Putting a job in an error status is done by the element that processes it. It can do that by using traffic-light connections or by failing the job (sending it to Problem jobs), sometimes both: if PitStop Server detects errors in the preflight, the job goes to the error connection; if the file is not a PDF, it goes to Problem jobs. You can make a job fail by using remote processing with the remote processing API, but I fail to see the use case.
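For completeness, both routes look roughly like this from a script element (Node.js scripting API again; verify the Connection.Level enum and the fail() signature against the scripting reference, as they are from memory):

// Sketch of both error routes from a script element.
async function jobArrived(s: Switch, flowElement: FlowElement, job: Job) {
  const name = await job.getName();
  if (name.toLowerCase().endsWith(".pdf")) {
    // Traffic-light routing: out via the success (green) connection.
    await job.sendToData(Connection.Level.Success);
  } else {
    // Failing the job sends it to Problem jobs.
    await job.fail("Not a PDF: %1", [name]);
  }
}
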
tristan.juhe
Newbie
Posts: 3
Joined: Mon Jan 25, 2021 9:55 pm

Re: Web Service API - good practices

Post by tristan.juhe »

Hi freddyp, thank you for your answer.

I just want to be sure I understand your explanation about job management in Switch ;)
I agree with the unique prefix for each job; this is what we see in the UI, so when a new file is created (preflight result, conversion, per your examples) each new file gets a new prefix.
In our case (see our simple workflow screenshot) it is always the same file; we just have some simple forks going to the recycle bin to keep traces.
So as I understand it, the "prefix" in the UI is not the same concept as the "family" in the API. It looks like it is, but for some reason it's not the same.
So without a programmatic script there is no way to decide whether to keep the original job as the main one to continue in the workflow.

About GraphQL, we also tried to use it, but there are some annoying things:
- the job ID is not the same as in the API: it's the job prefix (like in the UI), so reconciliation with the API is not easy
- we don't have the job status
- there is no date (initiated), so tracking jobs is not simple

There are some screenshots and dumps from the API and GraphQL responses attached to illustrate these limitations.

About the job error: we followed your advice to send a job to the error status (it works as you describe), but the API does not show the job status as error. When we check the API documentation, it seems that the error status is for "jobs that could not be picked up by Switch". Another disappointment here :/

In the end we will do something different from our first idea and stop trying to follow our jobs.
I guess we will just display the jobs list API response as is (maybe with some "family" unification) and keep an eye on API updates, hoping to see some beneficial enhancements.

Thank you very much for your help.
BR
jan_suhr
Advanced member
Posts: 586
Joined: Fri Nov 04, 2011 1:12 pm
Location: Nyköping, Sweden

Re: Web Service API - good practices

Post by jan_suhr »

To clarify this a bit: the unique prefix is a way for Switch to keep track of every file that is routed through Switch. This means that if you enter one file into a flow, it gets a prefix ID. If you then have a folder with three outgoing connections, Switch has to make two copies of the file so it can send three files further on in the flow. The two new files each need a unique ID for Switch to keep track of them. When your flow is done with the job, it often isn't the file with the input job's ID that comes out at the other end.

The solution for you is to add a job order identifier as Private Data on the input folder. This Private Data tag will follow the job all the way to the output folder. Another way is to add this identifier to the filename.

How you get it there is another question, but it can be solved in a few different ways, mostly depending on where you will get this information from.
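As a sketch, a small script element right after the input folder could stamp the identifier like this (Node.js scripting API; "OrderId" is just an example tag name, and here the value is derived from the filename):

// Stamp an order identifier as private data so it survives the
// prefix changes at every fork.
async function jobArrived(s: Switch, flowElement: FlowElement, job: Job) {
  // Derived from the filename here; in practice it could come from
  // submit point metadata or an external order system.
  const orderId = await job.getName();
  await job.setPrivateData("OrderId", orderId);
  await job.sendToSingle();
}
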
Jan Suhr
Color Consult AB
Sweden
=============
Check out my apps
tristan.juhe
Newbie
Posts: 3
Joined: Mon Jan 25, 2021 9:55 pm

Re: Web Service API - good practices

Post by tristan.juhe »

When your flow is done with the job, it often isn't the file with the input job's ID that comes out at the other end.
=> Absolutely right! To keep the original job ID we tried to linearise our workflow and avoid file transformations...
The solution for you is to add a job order identifier as Private Data on the input folder. This Private Data tag will follow the job all the way to the output folder. Another way is to add this identifier to the filename.
=> Yes, that's what we do, using metadata on the Submit Point. Then we use the "Attach job state" folder feature to update the job state, because this property is retrievable with the API (list jobs method), so we can monitor the job.

We are in touch with Enfocus support to see how we can handle this in a better, more elegant way.

Thx.