Thursday, November 15, 2012

Reusability of Data Modules


A little while ago the subject of re-usability raised its head in the S1000D users Google group. In addition, there was a recent DCL Labs announcement of a webinar entitled "Stop Repeating Yourself… Reuse & Get Cleaned Up!", so they clearly also see re-usability as a serious subject for their organisation.

It is certainly a subject which is at the heart of the specification, and it allows solid long-term maintenance of technical information at reduced cost because data is not duplicated. This is particularly important in projects which are going to be in service for 25+ years (and that is not counting the inevitable equipment enhancements to allow it to stay in service for even longer).

Some projects actually have little scope for re-using data modules - unless, of course, there are warnings and cautions which crop up throughout the project.

Simple Examples

In some industries, simple things like the removal and refitting of panels provide a good example of where common modules can be used. Often, more complex jobs get broken down into small procedures which are common to several different maintenance functions.
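As a sketch of how that reuse looks in the data (illustrative Issue 4.0 markup; the model identification code "EX" and all code values are invented), a procedure that needs the panel off simply references the one common panel-removal data module instead of repeating its steps:

    <!-- Illustrative only: a step in some maintenance task referring
         out to a shared "remove access panel" DM (info code 520).
         All identifiers here are made up for the sketch. -->
    <proceduralStep>
      <para>Remove the access panel. Refer to
        <dmRef>
          <dmRefIdent>
            <dmCode modelIdentCode="EX" systemDiffCode="A"
                systemCode="51" subSystemCode="1" subSubSystemCode="0"
                assyCode="01" disassyCode="00" disassyCodeVariant="A"
                infoCode="520" infoCodeVariant="A" itemLocationCode="A"/>
          </dmRefIdent>
        </dmRef>.
      </para>
    </proceduralStep>

Any later change to the panel procedure is then made once, in the one data module, and every referencing task picks it up.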

Good Analysis Up Front 

How much can be reused does, nearly always, rely on a good analysis of the equipment before starting on the work. A good hard look at the breakdown of the equipment, the location of the various items within the 'target platform', and what maintenance is expected really helps to focus this part of creating S1000D DMs. One of the basic facts about S1000D has, in my observation of several projects, been that time spent up front, prior to generating the data, saves time downstream when modifications, updates etc. are made. The more common features that can be found to start with, the less duplicate work there is later on. Although I do appreciate the pressures that are placed on the production of technical information, this investment up front does pay dividends later on, when production schedules can be even tighter and are sometimes adversely affected by late changes.

Common Items Not Recognised in Some Projects

At the other extreme, I know of one project where water system valves of a particular type and size are used throughout several systems on the platform. This was not recognised by the person who did the breakdown to create the Data Module Requirements List (DMRL). The result was identical information appearing throughout the equipment under completely different Data Module Codes. Of course, changes to these valves - due to modifications, changes in working practices etc. - then needed a heap of extra work to incorporate.

The above does not take into account the case where several platforms have common COTS (and even specialised) equipment. In these cases the relevant Data Modules can be used in more than one project. I know that some projects do not like this, but it has been proved several times that data modules complying with different Issues/Changes of S1000D can live together in one project publication. This can be allowed for in the initial analysis.

In one project, I remember, the Technical Publications Officer had been told specifically that Data Modules from different issues could not be used together in the same publication. This was something that we (in a previous company) were able to prove, by demonstration, to be wrong. Cross-referencing to other data modules worked fine - which was one of their main concerns.

Quality of DMs

One thing that has 'under-whelmed' me is the quality of some of the material that has crossed my desk over recent years. Although some of the work was supposed to have been done to S1000D, the resulting Data Modules do not seem to reflect the ethos of the Specification. Some Data Modules I have seen are clearly almost publications in their own right, but they have been 'poured' into a single descriptive data module.

I suppose even now the question raised here is "Have the various staff in positions of authority understood what S1000D is about?" During one training course that I delivered, I witnessed a delegate change from aggressive antagonism to 100% support as we moved through the days. I do see this as an extreme case, but if management understands the benefits of S1000D they do tend to be more supportive towards the staff that actually have to implement the specification.

Has anyone done the sums?

At another place, and with another modular specification, I carried out a cost-benefit analysis for a publication chosen for us by a group of people in the consumer electronics business (in the USA). In the User Guide family that was provided for analysis there was a great degree of duplication - whole areas of the Owner's Guide were duplicated in the Maintenance Book - so this really did not do justice to the analysis exercise. However, the saving in initial publication effort and costs was very significant, multiplied of course by the saving on a whole family of similar projects (you know the sort of thing, I am sure - different models in the same range with improved specifications but the same basic controls, layout etc.).

I realise that this may be a bit far down the line for S1000D projects, but has anyone carried out a similar analysis comparing a 'Traditional Publication' with an 'S1000D-based Publication'? If so, is it possible to share this work? I would be happy to make it available (converting the text into S1000D, of course) on the Mekon website.

3 comments:

Anonymous said...

Hi Martyn,
Thanks for your blog and your timely insight. I'm in the process of developing a DMRL for a new aircraft. With my limited experience, it seems wrong to go through the expense and learning curve of S1000D without benefiting from the specification's pièce de résistance: elimination of duplicate content and tight effectivity control. I'm assigning Removal and Installation Info Codes to each LRU. Many components have Servicing, Adjustment/Test, Inspection/Check, Operational Check, Disassembly and Reassembly Info Codes as well. I anticipate some of these tasks being eliminated on further review; however, not starting with a highly granular approach seems inconsistent with the spec.
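To make that concrete, here is the shape of what I mean for a single LRU (Issue 4.x style; the model ident "EX" and the hardware codes are invented, and the info codes shown are the generic ones as I understand them):

    <!-- One fuel-pump LRU: same hardware breakdown codes throughout,
         only the info code varies per task type. Invented identifiers. -->
    <dmCode modelIdentCode="EX" systemDiffCode="A" systemCode="28"
        subSystemCode="2" subSubSystemCode="0" assyCode="05"
        disassyCode="00" disassyCodeVariant="A"
        infoCode="040" infoCodeVariant="A" itemLocationCode="A"/> <!-- Description -->
    <!-- ...plus sibling DMs that differ only in the info code, e.g.
         infoCode="200" (servicing), infoCode="520" (removal),
         infoCode="720" (installation). -->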
In reviewing other S1000D IETMs (private enterprise) we've been able to purchase, I'm surprised at the lack of granularity many of the publications have employed. It's common to see Removal/Installation or Inspection/Adjustment tasks sharing the same DM. Without granularity, the benefit of leveraging the DMs in an LMS or in production-floor Work Instructions is reduced. Why bother if your content can't be efficiently reused (legacy issues and other constraints not taken into account)? One might as well use the many powerful unstructured authoring and scripting tools that are currently available.
I am concerned about efficient application of effectivity once our authoring starts. As we're still in the development phase, I'm not totally fluent with our authoring tool. From the research and training I've undertaken, it seems that employing our highly complex applicability will require an extremely high attention to detail and be full of pitfalls. I hope to see authoring tools that help with applicability tagging. More time spent authoring will also address this concern.
Another concern with using a highly granular approach is IETM navigation. Jumping from DM to DM to DM to DM is not user friendly. This broken, interrupted navigation scenario is slightly concerning.
Best Regards and Merry Christmas.

TCooke said...

Martyn,
I feel the pain of the previous commenter in all the issues of setting up a DMRL. Your comments about S1000D bring to light the problem with the current environment: we have a very good specification (S1000D 4.0) but legacy tools forcing us back into the same old bad practices.
DMRL generation is as simple as this: assign every installation the same common codes for description, locations, inspection, operational test, functional test, removal, installation and parts. Add others as required. The DMRL is not built in advance; it is built from the bottom up as "discoveries" of need are revealed by engineering. Title pages, introductions etc. are the icing on the cake - a massive amount of front matter is auto-generated from PROPERLY tagged content.
Applicability is not effectivity. The old complex Boolean evaluation statements are now redundant if the PCT is used as a data filter. Example: If you construct a complex Boolean expression, you must match its constraints in the PCT (redundant). Skip the complexity and just tag up the relevant assert in the data and let the PCT handle the dependencies. I have been successful in demonstrating that tool issues are restraining progress in this area.
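A rough sketch of the pattern (attribute names per the Issue 4.0 schema; the "serialNo" property and its values are invented):

    <!-- In the data module: a plain assert on a product attribute,
         rather than a complex Boolean evaluate tree. -->
    <applic>
      <assert applicPropertyIdent="serialNo"
          applicPropertyType="prodattr"
          applicPropertyValues="001~050"/>
    </applic>

    <!-- In the PCT: each product instance carries its attribute
         values, so filtering resolves the assert at delivery time. -->
    <product>
      <assign applicPropertyIdent="serialNo"
          applicPropertyType="prodattr"
          applicPropertyValue="007"/>
    </product>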
XSL style sheets to the rescue.
Are we trying to cover, in the DMRL, data needs that can be handled in the processing instead? Properly identified (tagged) data can be extracted, and excessive granulation can be avoided.
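XSLT makes that extraction almost trivial; a minimal sketch (element names are from the Issue 4.0 procedural schema; the output format is invented):

    <!-- Pull just the procedural step text out of a DM, e.g. to feed
         production-floor work instructions, leaving front matter and
         the rest to other templates. -->
    <xsl:stylesheet version="1.0"
        xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
      <xsl:template match="/">
        <steps>
          <xsl:for-each select="//mainProcedure//proceduralStep/para">
            <step><xsl:value-of select="."/></step>
          </xsl:for-each>
        </steps>
      </xsl:template>
    </xsl:stylesheet>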
The parents of properly behaved children can send them to publications without their siblings. Disciplined tagging results in maximum reuse, not just DMRL granularity.
Navigation requires a style sheet model with multi-path thinking. If you are trapped in a dead-leg, blame the style sheet. Hyperlinks (and menus) are easy and just require imagination to function effectively.
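For instance, a template along these lines (it would slot into a style sheet like the sketch above; the href scheme is invented, since resolution is viewer-specific) turns every dmRef into a live link:

    <!-- Render each dmRef as an HTML hyperlink. How a data module
         code maps to a target URL is viewer-specific; this concat
         is only an illustration. -->
    <xsl:template match="dmRef">
      <xsl:variable name="dc" select="dmRefIdent/dmCode"/>
      <a href="{concat('dm/', $dc/@modelIdentCode, '-',
          $dc/@systemCode, '-', $dc/@infoCode, '.html')}">
        <xsl:value-of select="concat($dc/@systemCode, '-', $dc/@infoCode)"/>
      </a>
    </xsl:template>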
This is a new year. Let's begin again with a clean sheet and leave the old players at the door.

Anonymous said...

TCooke,
Thanks for your insight. I'm interested in your comment about tool issues restraining progress in applicability. Can you clarify?

I look forward to future correspondence as our project matures.
Regards -
Tedrick