Transcription

Validating Software for Manufacturing Processes
by David A. Vogel, Ph.D.
Intertech Engineering Associates, Inc.
As published in Medical Device & Diagnostic Industry, May 2006

The software for medical device processes—engineering, quality, regulatory, and so on—must be validated. You don't have to be a software engineer to do it.

Validate it? I just want to use it! Sound familiar? Most companies in the medical device industry understand and accept the need to validate software that is critical to the functioning of a medical device. Perhaps not as widely understood or accepted is the regulatory requirement to validate software that is used to automate any process that is part of a medical device manufacturer's quality system. This broad requirement encompasses manufacturing, engineering, quality, and regulatory functions within the firm.

The responsibility for validating such software often falls to the user of the software, who knows little, if anything, about software validation. Although users may not feel qualified to validate software, it is not necessarily essential to hire software professionals to validate it for them. Non-software engineers can validate many types of software. This article is designed to help non-software engineers understand what validation is, how to go about it, and how to know which validation projects really should be left to software-quality professionals.

Some non-software engineers feel that doing software validation is wasting time. Perhaps they have seen or been part of software testing that simply exercises all the menu commands, and never finds any defects—ever. If validation efforts only include testing, engineers are probably overlooking critical validation activities.

COMPANY PROFILE
Intertech Engineering Associates, Inc.
Address: 100 Lowder Brook Avenue, Suite 2500, Westwood, MA 02090
www.inea.com - (781) 801-1100
Industry: (Electro)Medical
Skills: Product Design; Risk Management; Requirements Engineering; Electronics Development; Software Development; Software Verification and Validation; Production/Quality System Software Validation

Regulatory Background

FDA's quality system regulation (QSR) that applies to the validation of the software types discussed here is 21 CFR 820.70(i), which addresses automated processes. In addition, 21 CFR Part 11 is the collection of regulations related to electronic records and electronic signatures.

It is useful to look at the regulatory origins to understand what is law and how it differs from the guidance information that FDA produces to interpret the law. The regulation that specifically applies to this software is found in the section on Production and Process Controls, and states:

"(i) Automated processes. When computers or automated data processing systems are used as part of production or the quality system, the manufacturer shall validate computer software for its intended use according to an established protocol. All software changes shall be validated before approval and issuance. These validation activities and results shall be documented." 21 CFR 820.70(i)

FDA is actually quite good about producing documents to interpret and elaborate on the federal regulations it is charged with enforcing. The agency issued a software validation guidance in January 2002. This document, "General Principles of Software Validation; Final Guidance for Industry and FDA Staff" (commonly referred to as the GPSV), includes a section (Section 6) that interprets this regulation.1 The next step is to learn how to apply that interpretation.

What Kinds of Software Must Be Validated?

To answer this question, it's important to understand why the software needs to be validated. There are precise definitions of validation and broadly accepted activities that lead to the conclusion that software is validated. But, when all is said and done, validation activities must confirm that the software does what the user wants it to, and that patients, users, bystanders, the environment, and the medical device company are reasonably well protected from any potential failure of the software.

So, what software needs to be validated other than that which is part of a medical device? It is often tempting to simply conclude that all software should be validated.

What is Required?

As noted earlier, 21 CFR 820.70(i) requires validation of software that automates all or part of any process that is part of the quality system. That software includes the following:

- Software used as part of the manufacturing process (including software embedded in machine tools, statistical process control software, programmable logic controllers [PLCs], and software in automated inspection or test systems).
- Software used in process validation (such as statistical calculation software, spreadsheets, etc.).
- Software used in design and development processes (such as CAD software, CAM software, software development tools, software test tools, compilers, editors, code generators, etc.).
- Software used to automate part of the quality process (such as complaint-handling systems, lot-tracking systems, training-database systems, etc.).
- Software used to create, transmit, modify, or store electronic records that are required by regulation.
- Software used to implement electronic signatures for documents required by regulation.

Those are the types of software that the regulation requires to be validated.
If a device company is using software to automate a process that is required by FDA, it is essential to show that the software accurately, reliably, and consistently meets the requirements for its intended use.

Does that mean you need to do it simply because FDA says so? At the simplest level, yes. But why is FDA so interested in how software works? FDA isn't so interested in the software itself as it is in the processes that the software is automating. FDA wants to be sure those processes are accurate, reliable, and consistent.

If FDA is interested in a company's processes, shouldn't the company also be interested? If software validation reduces the risk of a failure that could ultimately result in patient harm or jeopardize the integrity of other quality systems, then why not require software validation to reduce the risk of other, non-regulated functions? Wouldn't it be nice to reduce the risk of software failure that could disable your company's e-mail for a week, or shut down a production line for hours at a time, or delay deliveries of raw materials, or lose track of accounts receivable? Shouldn't the company be as concerned about these functions as FDA is about those that are regulated? The point is that software validation is not just a regulatory nuisance; it is fast becoming a necessity for the device industry's increasingly software-controlled environments.

What Software Should Non-Software Engineers Validate?

Non-software engineers should be able to validate most software categorized as off-the-shelf or embedded. As its name implies, off-the-shelf software is purchased for a specific purpose, such as CAD software, compilers, or calibration-tracking software. Embedded software (or firmware) is software that is part of a machine tool or instrument. Sometimes it may not be obvious that an instrument is designed with software embedded in the design. Certainly, instruments with graphic user interfaces are based on embedded software. Other instruments or tools with simpler user interfaces may power up with a splash display that briefly communicates the version of embedded software that is controlling the display. A large machine tool may include many microprocessor-controlled subsystems (and thus use embedded software). It may take some effort to even identify how many software items are included in some instruments and tools. PLCs can, in general, be treated like embedded software systems.

For all but the simplest custom software (software written for a specific purpose that is unique to a company), validation should probably be left to software development and validation professionals. Spreadsheets, macros, batch files, and similar items created in house for specific purposes should all be treated like custom software, but those are usually small and simple enough that they can be validated by non-software engineers.

Of course, the distinction is not always that clear. There are combinations of the above classifications. For example, many off-the-shelf software packages require custom software elements in order to do anything useful. There are also custom software systems that include some subelements that are either off-the-shelf, custom developed internally, or custom developed externally. Non-software engineers can participate in the validation of these complex systems by focusing on the system-level validation for intended use, while leaving some of the more-technical verification testing activities for the software development and validation professionals.

To understand why the type of software makes any difference in determining who might be capable of validating it, it is important to understand the following:

- The tools available to validate the software in the state it is presented.
- The assumptions the engineer can make about the state of the software when it is presented for validation.
- The objective: to keep defects from getting into the software, to find defects that are already in the software, or to protect the company from defects that one simply assumes are in the software.

What is Validation, Anyway?

First, it helps to understand what validation is not:

- Validation is not synonymous with testing.
- Validation and verification are not interchangeable terms.
- Software verification and validation (SV&V) activities are not simply testing.

FDA's definition of validation is a good one: "Confirmation by examination and provision of objective evidence that software specifications conform to user needs and intended uses, and that the particular requirements implemented through software can be consistently fulfilled."1 Note the absence of the word test from this definition. Testing may be one of the means that is used to provide the objective evidence that the requirements implemented in software can be fulfilled, but it is not the only means, nor is it sufficient alone.

Validation comprises all activities appropriate for an engineer to come to a reasonable conclusion that a given piece of software reliably meets the requirements for its intended use. Some of those activities are verification activities. For example, for custom-developed software, verifying that each software requirement is represented in the design is a verification activity. That activity has provided an additional increment of confidence that the user's needs (as represented in the software requirements) will be implemented because it was verified that they are properly represented in the design. (This provides just partial confidence; there is still plenty that can go wrong between design and final implementation.)

Testing, too, can be a verification activity. It verifies that the documented design is properly implemented.

Many activities can contribute to the conclusion that software has been validated. Requirements management, design reviews, and defect tracking, as well as unit, integration, and system-level testing, are all techniques available to software professionals during development. Many of these techniques help prevent defects from getting into the software during development. Risk management, change control, life cycle planning, system-level testing, and output verification are well within the grasp of non-software professionals. These techniques are focused on identifying any defects that are in the software, preventing defects from appearing later in the life cycle, and planning for the inevitability that defects will be discovered once the software is used.

In layman's language, validation simply gets down to answering the following questions:

- What are you counting on the software to do?
- What makes you think that the software is working?
- Can you tell when it is not working?
- What will you do about it if and when the software fails?
- What can accidentally cause the software to fail?

Rote exercising of each menu item in a software application doesn't fully address any of the above points. It does provide objective evidence. Unfortunately, it may not be objective evidence that the software meets the requirements of the intended use, or that those needs will be consistently fulfilled.

Validation Step-by-Step

For the types of software that non-software engineers can easily validate, the validation process consists of five fundamental components:

- Life cycle planning.
- Identification of requirements for intended use.
- Identification and management of risk.
- Change control.
- Testing.

Documentation of the activities that support these components provides the evidence that the software will meet the requirements for its intended use consistently.
Life Cycle Planning. A software life cycle is a description of the phases that software goes through, from the initial concept that software might be used to automate a process through the acquisition, installation, maintenance, and eventual retirement of the software.

Figure 1. Example waterfall life cycle for developed software.

The actual phases may differ from category to category, or from company to company. Software techies have various software development life cycles that are widely described in the literature. Many of those life cycles do not apply to the types of software that are considered within the scope of this article, because they are mostly concerned with the development-related phases of the life cycle.

It is helpful to consider why this is important. The basic concept is to think about the software's life within the organization and to plan activities at appropriate phases that will contribute to the company's confidence that the software will meet the user needs consistently. Remember that validation means that the software meets the requirements for its intended use consistently. Consequently, it is appropriate to review the validation components at all or at least several phases of the life cycle to ensure that the assumptions made early in the life cycle are still applicable later in the life cycle.

There is no single life cycle model that fits all types of software used to automate parts of a quality system. The life cycle of a spreadsheet is very different from the life cycle of a complaint-handling system that will be deployed in a hundred locations worldwide. Even the life cycle of a single-use spreadsheet is different from the life cycle of a spreadsheet template that could be used by a number of people. The considerations within each life cycle phase are different depending on the software item, its intended use, and its intended users. This is not boilerplate. It does take some thought, but that thought can establish the foundation of validation activities for the life of the software. This kind of activity addresses the consistency requirement of the definition of validation.

Define a life cycle for the software by itemizing the phases that the software will go through. For each phase, detail the activities to be performed to support the remaining validation components. This becomes the validation plan for the software. Document it. File it. Follow it.
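To make the planning idea concrete, here is a minimal sketch of how the phases and per-phase activities might be itemized for a hypothetical in-house spreadsheet; the phase names and activities are illustrative assumptions, not a format prescribed by the article or by FDA.

# Hedged sketch of a life-cycle validation plan for a hypothetical in-house
# spreadsheet; the phase names and activities are illustrative only.
life_cycle_plan = {
    "Concept": [
        "Itemize intended uses",
        "Draft requirements for intended use",
        "Predict high-level risks of software failure",
    ],
    "Selection/Creation": [
        "Review requirements against the candidate package or draft spreadsheet",
        "Identify failure modes and risk control measures",
    ],
    "Installation/Release": [
        "Record the validated configuration (versions, environment)",
        "Test risk control measures and key requirements",
    ],
    "Maintenance": [
        "Re-review intended use and requirements on each change",
        "Revalidate per the change-control procedure",
    ],
    "Retirement": [
        "Archive validation records; migrate or retire data",
    ],
}

for phase, activities in life_cycle_plan.items():
    print(phase)
    for activity in activities:
        print("  - " + activity)

The format matters far less than the content: documenting which activities happen in which phase is what turns the life cycle into a validation plan.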

Requirements Identification. This is not as hard as it sounds. What are the intended uses of the software? Itemize them in sentences or short paragraphs. For each intended use, define the requirements for the software to adequately meet that intended use. Use quantifiable, verifiable language to define the requirements. The following is inadequate: "The software shall control the temperature of the chamber to whatever the operator sets it to and shall get to that temperature as quickly as possible." This is more like it: "The software shall control temperature of the chamber with a resolution of 0.2°C and an accuracy of 0.4°C. The software shall operate over a range of 37–120°C. The software shall drive the chamber to heat at a minimum rate of 10°C per minute. The software shall not allow the temperature to overshoot the set point temperature by more than 0.5°C anywhere in the operating range."

In planning requirements activities, identify the requirements early in the conceptual phases of the life cycle. Review and revise the requirements as you evaluate competing software packages.

Even in post-deployment maintenance phases, those responsible for validation should also review and revise the intended uses and requirements for the software. Later upgrades and maintenance releases of the software may introduce new features that will change the intended use (and therefore the requirements) for the software. Account for this in the life cycle validation planning to indicate the need to review requirements in the maintenance phase.
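To see why quantifiable wording matters, here is a minimal sketch showing how the chamber requirements above become objectively checkable; the log format, helper names, and sample data are assumptions for illustration, not part of the article.

# Hedged sketch: check logged chamber data against the example requirements.
# The log format (seconds, degrees C) and the sample data are hypothetical.
def meets_heating_rate(log, min_rate_c_per_min=10.0):
    """True if temperature rises at or above the minimum rate over the ramp."""
    (t0, temp0), (t1, temp1) = log[0], log[-1]
    rate = (temp1 - temp0) / ((t1 - t0) / 60.0)
    return rate >= min_rate_c_per_min

def meets_overshoot_limit(log, set_point, max_overshoot_c=0.5):
    """True if no sample exceeds the set point by more than the allowed overshoot."""
    return all(temp <= set_point + max_overshoot_c for _, temp in log)

ramp_log = [(0, 37.0), (60, 48.2), (120, 59.5)]       # (seconds, deg C)
settle_log = [(120, 59.5), (180, 60.3), (240, 60.1)]  # around a 60 C set point
print(meets_heating_rate(ramp_log))                       # True
print(meets_overshoot_limit(settle_log, set_point=60.0))  # True

A requirement written as "as quickly as possible" cannot be checked this way; one written as "at least 10°C per minute" can.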
Risk Analysis & Management. Risk analysis is predicting, quantifying, evaluating, and controlling risks associated with the use of the software. Risk management is the identification and design of methods to detect software failures and to prevent, correct, or mitigate the damage caused by such failures.

The risk component of validation should be factored in at several phases of the life cycle, too. In the early conceptual phase, engineers can predict what risks may be present from the use of the software. In later phases, as more is known about the software and the system or process it controls, individual failure modes may be identifiable. At all phases, those responsible for the software should consider what kinds of risk control might be put in place to reduce the risk of harm from failure of the software.

For example, consider software to control a sterilizer. Without knowing anything about the requirements for the software, or how it is implemented, one can readily appreciate that there is a risk that a software failure might result in parts not being fully sterilized. In later life cycle phases, the analysis of risks gets more detailed, and it begins to recognize specific failure modes that might result in non-sterilized parts. The software may not run the sterilizer long enough. The sterilizer mechanism may become ineffective (blown fuses, out of sterilizing chemicals, occluded input lines, occluded drains, etc.). Will the automating software detect these situations, and will it function properly in each case? If not, control measures should be identified to ensure safe operation.
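One lightweight way to capture this kind of analysis is a failure-mode worksheet that pairs each anticipated failure with how it is detected and how its harm is controlled. The sketch below is illustrative only; the failure modes, detection methods, and controls are assumptions about the hypothetical sterilizer, not requirements from the article.

# Hedged sketch of a failure-mode worksheet for the hypothetical sterilizer;
# entries are illustrative assumptions, not a prescribed analysis.
failure_modes = [
    {
        "failure": "Cycle ends before the required exposure time",
        "detection": "Software compares elapsed time against the recipe; cycle log review",
        "control": "Abort with alarm; load quarantined until the log is reviewed",
    },
    {
        "failure": "Sterilant exhausted or input line occluded",
        "detection": "Flow/pressure sensor monitored by the software",
        "control": "Alarm and halt; biological indicator on each load",
    },
    {
        "failure": "Occluded drain prevents chamber evacuation",
        "detection": "Vacuum level not reached within the time limit",
        "control": "Alarm; interlock blocks the next cycle until maintenance clears it",
    },
]

# Every detection or control that depends on the software becomes a test
# target in later life cycle phases.
for item in failure_modes:
    print(item["failure"])
    print("  detect:  " + item["detection"])
    print("  control: " + item["control"])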

If control measures depend on the software to detect hazardous situations and to take appropriate action, these requirements of the software certainly should be targets of testing in later phases of the software life cycle.

Sometimes software controls only one component of a larger process. Later operations, inspections, or cross-checks in the process may verify the output of the software-driven component of the process. This is one of the best risk control measures for software failure, and it results in solid validation of the software. The surrounding process is verifying every output of the software throughout the life cycle of the software. This is much more confidence boosting than a week of testing once in the life cycle of the software. In fact, this type of thinking, with appropriate documentation of the rationale, can even reduce the amount of testing required.

One component of risk is the likelihood that a failure can occur and result in harm. At a very high level, it is important to consider the pedigree of the software to assess the likelihood of failure. This is why custom-developed software has so many more validation activities associated with the requirements, design, and development phases of the software development life cycle. These activities provide a level of assurance that the design and development processes were conducive to producing high-quality software. If software is downloaded freeware or shareware, the pedigree is unknown, and the likelihood of failure is unknown and must be assumed to be high. Many more checks and balances or testing should be considered for high-risk software. If software is purchased from a reputable supplier that is known to have quality software used for similar purposes and known to have a large user base, an assumption of low risk of failure can be rationalized.

Change Control and Configuration Management. At some point—prior to deployment of the software—the software item is considered to be validated for its intended use. How do you make sure the intended use, and thus the state of validation, doesn't change? That is what must be addressed in the change-control activities in the later phases of the software's life cycle. Software professionals usually refer to these activities in their configuration management plans. Configuration management also includes many other activities related to the development of software. For the types of software considered in this article, change control is the most important component of configuration management. The points to consider include the following:

- How is the validated configuration of the software item identified? Document the version, build, or time-and-date stamp of the software.
- What else is needed for the software to operate? Identify any other software that is required for the operation of the validated software item. Record the versions of any of these collateral software items. For example, if an engineer is validating a spreadsheet, it is essential to record the version of the spreadsheet validated (probably the time-and-date stamp of the spreadsheet file), and to record the version information for the underlying spreadsheet application program (e.g., Excel 2003, build 11.6560.6568, service pack 2). Identify which associated hardware and operating system version levels were part of the validated configuration (a sketch of such a record appears after this list).
- Who is responsible for determining when the software can change? This is change control. How will changes to the software be controlled? Someone should be identified as responsible for deciding when the software changes, and for revalidating the software after it changes.
- What should be done to revalidate the software when a change is made? Revalidating means more than retesting. Requirements and risks need to be reevaluated to be sure they haven't changed with any new features or other changes to the software. Maintenance-phase changes to the software should be viewed as their own mini life cycles, as almost all validation activities of each life cycle phase should be reviewed, revised, and supplemented to adequately validate the new software.
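As referenced in the list above, the sketch below shows the kind of configuration record that captures this information for a hypothetical validated spreadsheet; the file name, application and operating system versions, and responsible role are placeholders, not values from the article.

import hashlib
from datetime import datetime, timezone

# Hedged sketch of a validated-configuration record for a hypothetical
# spreadsheet; all names and versions below are placeholders.
def file_fingerprint(path):
    """SHA-256 of the validated file; a stronger identifier than a date stamp."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def configuration_record(path, application, operating_system, owner):
    """Assemble the items worth writing down for the validated configuration."""
    return {
        "item": path,
        "item_sha256": file_fingerprint(path),
        "application": application,
        "operating_system": operating_system,
        "recorded_on": datetime.now(timezone.utc).isoformat(),
        "change_control_owner": owner,
    }

# Example call with placeholder values:
# record = configuration_record("process_capability.xlsx",
#                               "Excel 2003, build 11.6560.6568, SP2",
#                               "Windows XP SP2", "Quality Engineering")

Whether the record lives in a script, a form, or a logbook is immaterial; what matters is that the validated configuration is identified well enough that any later change is detectable.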

Testing. Testing is really a risk control measure. Risk combines the severity of harm resulting from a failure with the likelihood of the failure. Testing can reduce the likelihood of failure, thus reducing risk. The level of reduction, of course, depends on the quality of the testing. Furthermore, because the likelihood of failure is unknown before the test, and because an engineer likely does not have a good quantitative measure of how much testing reduces the likelihood of failure, it leaves an engineer with little to measure how much the testing has lowered the risk. All that is known is that some testing is probably better than no testing, and more testing is probably better than less testing.

So what can be done to increase the value of testing? First of all, use all that great thought that went into risk analyses and risk management plans. If a company has risk controls in place to prevent, detect, correct, or mitigate failures in the process that is automated by software, it is imperative to test them. Be sure they really do prevent, detect, correct, or mitigate. FDA's GPSV guidance repeatedly calls for validation effort commensurate with complexity and risk. Focusing testing on making sure risk control measures are effective is perhaps the best use of a test budget that is commensurate with risk.

Next, focus test efforts on areas of complexity, because that's where defects are likely to be found. Look for complex error conditions to make sure the software deals with them properly. For example, in many software-driven instruments, power failure and recovery handling are often fruitful areas of testing simply because they are often implemented as afterthoughts. The conditions are complex, difficult to predict, and difficult to simulate. On the production floor, however, power failure is a fact of life. Machines can destroy valuable product or simply self-destruct because the software designers didn't anticipate the software starting with the machine in an unexpected state. Similarly, user error or intentional misuse of the software is often not predicted by the software developer and consequently may not be handled properly by the software.

Check for conditions that could cause problems, such as pressing two buttons at the same time, stuck inputs, and out-of-range input values. Perform operations in different sequences to ensure that the software functions properly in each case. Testing functionality in which defects are suspected (i.e., error guessing) is testing budget well spent. Conversely, exercising menu commands (which probably have been exercised millions of times by other users) seldom yields new defects. The best test is one that finds a new defect.
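As a small illustration of error guessing (a sketch, not a method taken from the article), the checks below probe boundary, out-of-range, and malformed set-point inputs against a stand-in input-handling routine; parse_set_point and its limits are hypothetical.

# Hedged sketch: error-guessing checks against a hypothetical set-point parser.
# parse_set_point() stands in for whatever input handling the real software has.
def parse_set_point(text, low=37.0, high=120.0):
    """Return a set-point temperature within range, or raise ValueError."""
    value = float(text)  # raises ValueError for malformed input such as "abc"
    if not (low <= value <= high):
        raise ValueError("set point %.1f outside %.1f-%.1f C" % (value, low, high))
    return value

# Cases chosen by error guessing: exact boundaries, just outside them,
# negative, non-numeric, and empty input.
for text in ["37", "120", "36.9", "120.1", "-5", "abc", ""]:
    try:
        print(repr(text), "->", parse_set_point(text))
    except ValueError as err:
        print(repr(text), "-> rejected:", err)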
Special Situations: 100% Verifiable Output

In certain situations, the output of a software-driven machine tool or software-driven process may be 100% verifiable. For example, consider a software-driven production instrument that crimps connectors onto a wire lead. The pull strength and conductivity of the lead are tested by a quality control (QC) test on every lead that is produced by the machine. In this case, the output of the software-driven machine is 100% verified by the QC tests on every lead ever produced. This is a much better validation of the output of the machine than any software testing executed at a snapshot in time would ever produce.

Does the software still need to be validated? The answer is yes. Again, validation is not synonymous with testing. The analysis that would lead one to ask this question is, in fact, a validation activity. To ask the question implies that one has evaluated the intended use and has combined that with a risk management plan to check the machine output for the safety-critical attributes of pull strength and conductivity. Intended use and risk management are validation activities.

Now consider how changes to the software are controlled, and how the validation state of the software would need to be reevaluated when the software changes. Again, this change control or configuration management is a validation activity.
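A sketch of that reasoning in code form, using hypothetical acceptance limits rather than values from the article: because every lead's pull strength and conductivity are checked downstream, any software malfunction that degrades those attributes is caught no matter what the embedded software did during the crimp.

# Hedged sketch: 100% output verification for the hypothetical crimping example.
# The acceptance limits and measurement units are illustrative assumptions.
MIN_PULL_STRENGTH_N = 50.0   # hypothetical minimum pull strength, newtons
MAX_RESISTANCE_OHM = 0.05    # hypothetical maximum crimp resistance, ohms

def lead_passes_qc(pull_strength_n, resistance_ohm):
    """Release a crimped lead only if both QC measurements are within limits."""
    return (pull_strength_n >= MIN_PULL_STRENGTH_N
            and resistance_ohm <= MAX_RESISTANCE_OHM)

print(lead_passes_qc(62.3, 0.03))  # True  -> release
print(lead_passes_qc(41.0, 0.03))  # False -> reject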

Testing in this example may be greatly reduced. If it is concluded that any possible software malfunction that could affect product quality would be detected in the QC test, then software testing for those malfunctions can be greatly reduced or eliminated. Some testing may still be recommended for operator safety functions (such as emergency stops and safety interlocks), security functions, power-fail and recovery functions, etc.

Note that the testing is focused on functions related to intended use and safety, not on trying to reverse engineer the detailed software requirements that are then verified in numerous and lengthy tests. The validation is the collection of all of the activities that lead to the conclusion that the software is fit for use. Documentation of the activities, the resulting logic, and any test results becomes the validation package. If you do the work, take credit for it by documenting it.

Conclusion

Keep two key points in mind. First, an engineer does not need to be a software guru to validate some types of software for their intended uses. Second, software validation is not synonymous with software testing. Software validation is thinking rationally and systematically about the use of the software throughout its life cycle. Validation is establishing controls for ensuring the correct operation, detection capabilities for improper operation, backup plans for what happens if the software fails, and, yes, some testing to ensure that the software and the backup plans perform as desired.

Reference

1. "General Principles of Software Validation; Final Guidance for Industry and FDA Staff" (Rockville, MD: FDA, 2002).

ABOUT THE AUTHOR:
David Vogel is the founder and president of Intertech Engineering Associates, Inc.
