Export Metadata

Post Reply
strido
Member
Posts: 32
Joined: Tue Jan 19, 2016 7:02 pm

Export Metadata

Post by strido »

I think I'm missing a simple step here. I've connected to a database, and from there I'd like to export metadata. There's an error stating that the node requires one outgoing log connection.

I've put a "Log Job Info" node in, but the error won't go away.
gabrielp
Advanced member
Posts: 645
Joined: Fri Aug 08, 2014 4:31 pm
Location: Boston

Re: Export Metadata

Post by gabrielp »

That error is saying you need an outgoing connection that accepts the log data type. So make a new connection, and you should see a property on that connection saying something like "Data, Log, or Data with log". Set it to "Log" and the error should go away.
Free Switch scripts: open-automation @ GitHub
Free Switch apps: open-automation @ Enfocus appstore

Want to hire me? I'm looking for my next gig. Contact me on LinkedIn or via email.

Re: Export Metadata

Post by strido »

Hey look at that...

Thanks!

Re: Export Metadata

Post by strido »

This is gonna be a while...

What would a flow look like that grabs information from the database and then exports the metadata?

I did manage to find some sample flows, but none for exporting metadata. I'm trying to get it to read the database and then make a decision based on the information in it. For example, put Letter-sized files in one folder and Legal in another.

Re: Export Metadata

Post by gabrielp »

I don't see why you need to export the metadata to do that. In Switch, "export" usually refers to taking data from a job, removing it from Switch, and doing something with it elsewhere. I think you mean exporting from your database. What you'd need to do is just get the metadata for jobs that go through your flow; it sounds like you're already doing that. Then you can route the jobs based on that metadata.

Many of us simplify and abstract away our datasets as private data keys. If you don't want to get into this, you can just replace the simple PD keys I'll give in this example ([Job.PrivateData:Key="Shawmut Customer ID"]) with the longer variable expression for your dataset value ([Metadata.Text:Path="/field-list/field[2]/value",Dataset="Submit",Model="XML"]).

As jobs move through your workflow, they pass through a database configurator which pulls data from your MIS (or other database). That data is tied to the job as an external dataset or private data (depending on which method you're using). You now have what you need to route the job.

Let's say we route the job through a configurator that gets the finished size of the piece from the database. Perhaps we pass this configurator your job and component numbers, which you previously resolved from the file name. Now, on the other side of that configurator, you should have the result ("11x17") available to you via a Switch variable ([Job.PrivateData:Key="MyPrefix Finished Size"]). Next, make an outgoing connection from the output folder to another folder with the criterion: include jobs where [Job.PrivateData:Key="MyPrefix Finished Size"] equals "11x17". Then add another outgoing connection with "All other jobs" selected. Your flow should now route 11x17 jobs to the first folder and all others to the second.
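To make that routing concrete, here's a rough sketch of the flow shape I'm describing (the folder names and the private data key are just placeholders from this example):

Code: Select all

[Input folder] -> [DB configurator: look up finished size] -> [Output folder]
    [Output folder] -(Finished Size == "11x17")-> [11x17 folder]
    [Output folder] -(All other jobs)-----------> [Other sizes folder]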

Perhaps that's too specific: maybe you have too many sizes and want all jobs larger than a certain size in one folder or another. Well, you can use Switch variables to split the 11 and 17 out of "11x17", or use your database query to pull them separately. Then you can make an outgoing connection with a criterion like: include jobs where [Job.PrivateData:Key="MyPrefix Finished Width"] > "10.5", which would also match our example job.
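If you go the route of pulling the dimensions separately, the query might look something like this (the Jobs table and column names here are hypothetical; adjust them to your own schema):

Code: Select all

SELECT FinishedWidth, FinishedHeight FROM Jobs WHERE JobNumber = '123456'

Each value then lands in its own private data key (or dataset field), so you can compare the width numerically on its own.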

Re: Export Metadata

Post by strido »

gabrielp wrote:I think you mean exporting from your database. What you'd need to do is just get the metadata for jobs that go through your flow. Sounds like you're already doing that. Now, you can route the jobs based on that metadata.
Unfortunately I'm not even that far along yet.

At minimum, how would I get the metadata and export it to XML? Is metadata auto-generated?

Re: Export Metadata

Post by gabrielp »

strido wrote:At minimum, how would i get the metadata and export it to an XML? Is metadata auto-generated?
You don't need to export it as XML unless you need to use it in another application or something. You just need to make the information from your database accessible from Switch. This is done with job metadata, and it's a fundamental concept in Switch. As your job routes through your flow, it can pick up new metadata (private data declarations, database query results, checkpoint selections, etc.) which can be used for further routing. The configurator that handles the database query will do that for you. I don't know how the database module works, so I can't speak to that; perhaps that's where the confusion is coming from.

I can walk you through this one. Even if you don't use it, perhaps the concepts will help you.

Set that script up with your ODBC credentials. Use a direct query that you know is valid (hard-code your variables):

Code: Select all

SELECT * FROM Jobs WHERE JobNumber = '123456' 
Now, set the result type to dataset. Pass a job through it, but hold the job before a folder (to prevent Switch from removing the unique prefix which tracks its metadata). Then stop the flow, modify any connection, change "Include these jobs", and select "Define condition with variables". Add a new left-side condition, go to the Metadata category, choose Text, and next to the Path field, click the arrow and "Build location path". You should then see a new window; click around in the external datasets and you should see the database information for job '123456'.

You can click on any of these variables and place them into the condition as a variable expression. When you're done with the left side, you can compare it to something, finalizing your connection condition. It's always good practice to add a secondary connection set to "All other jobs" when doing this.

Now that you've got the basics working, you can make it dynamic by replacing '123456' in your query with a Switch variable ([Job.PrivateData:Key="JobNumber"], perhaps). Better yet, you can replace it with a parameterized query, which allows you to insert :jobNumber as a placeholder and will attempt to strip special characters that could be used for SQL injection. That's all optional stuff you can get into later.
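For example, the parameterized version of the earlier query would look something like this (:jobNumber is the placeholder that gets filled in from your Switch variable at runtime):

Code: Select all

SELECT * FROM Jobs WHERE JobNumber = :jobNumber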
Post Reply