Hey FME'ers. I've not been online much recently (I've been updating the FME training materials for 2017), but I wanted to throw out a new idea that is part challenge, part crowdsourcing.
I came up with the idea of an FME-driven service for assessing a workspace for best practices. Just now I found I wasn't the first to think of this, which shows there aren't any new ideas any more!
Anyway, in my spare time I've been putting together a workspace using the FMW reader to test other workspaces for best practice. I've built about 20 different tests and can think of quite a few more. But I thought it would be great if we - the FME community - could work together to take this idea to completion.
So, this is an open invite to take part in this project. I think there are a number of different ways you could contribute:
I've shared all the files in a folder on Dropbox. Anyone in the FME community is welcome to access this folder and use the contents for whatever you like. It just occurred to me that it might be better off in GitHub, so I'll do that as a future task.
If you want to contribute a test, then try to get a feel for the workspace style, pick a test that isn't done, and go for it. I haven't done any tests around transformers yet, so there is a lot still to do. And there may be reader/writer tests I haven't thought of. Preferably make a copy of the workspace, since we don't have proper revision control (yet). I'd like to keep it in 2016.1 or earlier for the moment, so no-one has to install a beta version.
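To give a feel for what one of these tests might look like, here is a minimal sketch that flags transformers left with their default names (e.g. "Tester_2"). It assumes the .fmw header stores transformers on lines carrying TYPE="..." and NAME="..." attributes; that layout is an assumption, so check a real workspace file (or use the FMW reader itself) before relying on it.

```python
import re

# Hypothetical sketch of one best-practice test: flag transformers that were
# never renamed from their defaults. The TYPE=/NAME= attribute names are
# assumptions about the .fmw header layout, not a documented format.
TRANSFORMER_RE = re.compile(r'TYPE="([^"]+)".*?NAME="([^"]+)"')

def find_unrenamed_transformers(fmw_text):
    """Return (type, name) pairs where the name is just the type plus a counter."""
    flagged = []
    for ttype, name in TRANSFORMER_RE.findall(fmw_text):
        # A default name is the transformer type, optionally followed by _<n>
        if re.fullmatch(re.escape(ttype) + r"(_\d+)?", name):
            flagged.append((ttype, name))
    return flagged
```

A test like this would feed its findings into the report as a warning, with the transformer name so the author can find it quickly.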
I'm also open to any and all other ideas about how to go about this, and how to collaborate on a project like this. As far as I know, there's never been a crowdsourced FME project before!
My end goal is to get this online and hosted in FME Cloud, so we can make a proper web service out of it. My idea is that everyone who contributes would get recognition on the web page (and a custom KnowledgeCentre badge of course)!
So, let me know what you think - and if you want to contribute then please do so.
Awesome idea as always. We've had this one come up as a possibility as well, and I even made a little start on it. My ideas were slightly different from yours, though the checks look familiar.
One idea is to come up with an overall score, on a scale from 0 to 100, where 100 is the best workspace ever. You define a number of categories to award points in, and you get deductions if something is 'wrong'.
At the time I made a small list of categories; I'll post them here for inspiration.
This matches quite well with the list already available, but it adds a valuation instead of just an error or warn, so you can take a quick glance to determine which areas need work, then use the warn/error information to fix specific problems.
A possible error as well: is Redirect to Data Inspector enabled?
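The grading idea above can be sketched in a few lines: each category starts at its maximum, each failed check deducts points, and a category is floored at zero. The category names and weights below are invented purely for illustration.

```python
# Minimal sketch of the 0-100 grading idea. Category names and weights are
# placeholders, not the actual list (which isn't reproduced here).
CATEGORY_MAX = {"annotation": 25, "naming": 25, "structure": 25, "logging": 25}

def score_workspace(deductions):
    """deductions maps category -> list of (check_name, points_off) tuples."""
    total = 0
    for category, maximum in CATEGORY_MAX.items():
        lost = sum(points for _, points in deductions.get(category, ()))
        total += max(0, maximum - lost)  # a category can't go negative
    return total  # 100 means no deductions anywhere
```

A perfect workspace scores 100; a single 10-point naming deduction drops it to 90, and deductions beyond a category's maximum are simply clamped.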
I'd like to join in with this because it seems like a fun idea. Regarding organization: FME workspaces are very annoying to merge, which is why putting it in version control is such a pain. I'd suggest making one master version available, moderated by @Mark2AtSafe. Provide a location where people can upload suggested changes, which @Mark2AtSafe can manually merge back into master. This version definitely needs to be in version control, so I agree with @sigtill to put this up on GitHub.
A few ideas for tests off the top of my head, not sure if it's possible to implement these :)
It would be great if this ran right after you hit "Publish to FME Server" on the File menu of FME Workbench, so you could validate the workspace BEFORE you publish it to an FME Server. Or, more generally: add an option to run a custom workspace before "Publish to FME Server" to validate something in that particular fmw workspace - for instance, that all your database connections point to staging/dev/prod with the correct users/paths etc. When uploading to multiple FME Servers (dev/stage/prod) this can easily be forgotten!
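Until such a hook exists in Workbench, something close could be scripted around the FME command line, which runs a workspace as `fme <workspace> --PARAM value`. In this sketch the validator filename and the SOURCE_FMW published-parameter name are assumptions for illustration.

```python
import subprocess

def build_validate_command(target_fmw, validator_fmw="best_practice_check.fmw"):
    # `fme <workspace> --PARAM value` is the command-line form for running a
    # workspace; the validator name and SOURCE_FMW parameter are hypothetical.
    return ["fme", validator_fmw, "--SOURCE_FMW", target_fmw]

def validate_before_publish(target_fmw):
    """Run the validator workspace against target_fmw; raise if it fails."""
    result = subprocess.run(build_validate_command(target_fmw),
                            capture_output=True, text=True)
    if result.returncode != 0:
        raise RuntimeError("Pre-publish validation failed:\n" + result.stdout)
```

A wrapper like this could be called from a deployment script before each upload to dev, stage, and prod, catching a wrong database connection before it reaches the server.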
My contribution. Updated 1.Report Header to add more information regarding the file itself.
Very interesting project Mark, here's a little contribution by me: a table with the transformer histogram. bestpracticereportgenerator-hm.zip
I haven't included the contribution by @sigtill in mine; I think it would be best if we come up with a smart way of handling more contributions. GitHub seems the best way to do that.
Some other ideas:
Great workbench. I have been testing it on some of my recent work and noticed it would be helpful to add href links to FME help documentation.
In a few areas of the report this could be applied, such as WorkspaceProperties, Annotation, Bookmarks...
For me, the most important things are those that let a 'stranger' adapt your workspace as quickly as possible without making mistakes. These include:
1. Provide an annotation when making use of a special setting that might influence the flow (e.g. Suppliers First in the FeatureMerger).
2. For a transformer that allows custom functionality (SQLExecutor / InlineQuerier / PythonCaller), provide comments in the code itself AND provide an annotation briefly stating the functionality.
3. Make sure the flow of the workspace is clear when completely zoomed out.
3. Make sure the flow of the workspace is clear when completely zooming out.
I don't know if there's a tool already out there, but maybe something could look at the log files of our workspaces (which we all too commonly ignore) and raise warnings about tasks that take longer than a 'normal' baseline or threshold. Brownie points for a way to compile performance stats from FME Server job logs ;)
For how we're using FME Server, a workflow that runs every 10 minutes and is optimized by 1 second could save almost 2 1/2 minutes a day (or 14 1/2 hours per year).
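The arithmetic behind those numbers is worth writing down: every 10 minutes means 144 runs a day, so 1 second saved per run is 144 seconds (about 2.4 minutes) a day, or roughly 14.6 hours over a year.

```python
# Worked version of the savings arithmetic above.
def yearly_savings_hours(interval_minutes, seconds_saved_per_run, days=365):
    runs_per_day = 24 * 60 / interval_minutes  # every 10 min -> 144 runs/day
    return runs_per_day * seconds_saved_per_run * days / 3600.0
```

For the 10-minute / 1-second case this comes out to about 14.6 hours per year, matching the figure above.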