Multiple Verification Methods for a SysML Requirement

Please share any best practices for associating multiple verification methods to a SysML requirement.

My current train of thought is to associate verification requirements, each stereotyped as <<extendedRequirement>>, with a VerifiedBy relationship from the requirement being verified. The Verification Method (e.g. Test, Inspection, Demonstration, Analysis) would be set in the <<extendedRequirement>> elements that verify the system requirement.

Alternatively, I could build a custom extension to the SysML profile for a "verification requirement", tagged with a verification method (e.g. Test, Inspection, Demonstration, Analysis), which would be associated with the requirement being verified using the VerifiedBy relationship. That said, I'd rather not build any custom extensions.
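
To make the two alternatives concrete for discussion, here is a rough sketch in plain Python (purely illustrative; the class and attribute names are my own shorthand, not SysML or tool syntax):

    from dataclasses import dataclass, field
    from enum import Enum
    from typing import List

    class VerifyMethodKind(Enum):
        TEST = "Test"
        INSPECTION = "Inspection"
        DEMONSTRATION = "Demonstration"
        ANALYSIS = "Analysis"

    @dataclass
    class Requirement:
        id: str
        text: str

    # Option 1: reuse <<extendedRequirement>> -- the verification requirement
    # carries a single verifyMethod value and is linked to the system
    # requirement it verifies by a VerifiedBy relationship.
    @dataclass
    class ExtendedRequirement(Requirement):
        verify_method: VerifyMethodKind = VerifyMethodKind.TEST

    # Option 2: a custom "verification requirement" stereotype -- same shape,
    # but the taxonomy is explicit and the method tag allows 1..* values.
    @dataclass
    class VerificationRequirement(Requirement):
        verify_methods: List[VerifyMethodKind] = field(default_factory=list)

    @dataclass
    class VerifiedBy:
        verified: Requirement   # the system requirement being verified
        verifier: Requirement   # the verification requirement

    sys_req = Requirement("SR-001", "The system shall ...")
    ver_req = VerificationRequirement(
        "VR-001", "Verify SR-001 by ...",
        verify_methods=[VerifyMethodKind.TEST, VerifyMethodKind.ANALYSIS])
    link = VerifiedBy(verified=sys_req, verifier=ver_req)

Under option 1, a requirement needing several methods would need several <<extendedRequirement>> elements (one per method); under option 2, one verification requirement can carry the whole set.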

  • Model-Based Test is an interest of mine and a domain in which I have some practical experience as a technical leader.

    Not speaking critically, but your expressed thoughts leave quite a bit of undefined space around your Test Engineering practice, so I'll make some assumptions. I tend to shy away from the idea of "Best Practice" when discussing MBSE, as most practitioners lack craftsman-level competency in the practice and I'm not convinced a 'Universal Best Practice' exists. For a particular project, the engineering strategy, driven principally by risk tolerance, schedule, and cost, is more likely what determines the right practice.

    First: I'm assuming you are using MagicDraw or Cameo because you refer to «extendedRequirement», which is a non-normative stereotype presented in the OMG SysML specification but is NOT in the OMG's SysML XMI. The «extendedRequirement» stereotype is NON-NORMATIVE and therefore, in my opinion, "custom" in the context of MD/CSM. The MD/CSM implementation does not specify a multiplicity for the "verifyMethod" tag definition, and under MD/CSM's default "<undefined>" multiplicity behavior only a single value specification is supported. You should closely evaluate the implementation of the «extendedRequirement» stereotype's properties and note how they vary from the properties of the OMG standard stereotype «abstractRequirement» in multiplicity and visibility. Your organization should also consider the justification for using the «extendedRequirement» stereotype to denote the requirement-taxonomy semantic of "verification requirement"; I suggest this creates ambiguity in your requirement taxonomy and should be avoided.

    Second: if your organization intends to model a "Verification Requirement" for each system requirement, then your organization should strongly consider adding a configuration-managed custom profile to support the practice. Elaboration of the verification method is also essential; merely stating the verifyMethodKind is insufficient (a notional sketch of what I mean is at the end of this post). I subscribe to the thinking of A. Wayne Wymore in this regard (see: A System Theoretic Framework for V&V). To paraphrase: How can it be proven that the as-built system meets stakeholder expectations? What is the set of objective evidence that provides proof?

    That said, I also suggest that there are many of us in the Test Engineering community who question "multiple verification methods" for a properly written black-box functional or non-functional requirement. I'm aware that many organizations misuse "verification method: analysis" (as defined in ISO-29148). As stated in ISO-29148, "analysis" does NOT mean processing data collected by stimulating and observing the realized system. That is a discussion for a different medium. The OMG SysML specification implies a more expansive scope for verification method = analysis.

    I have many more thoughts on this topic, but I'll stop here.
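
    One postscript, though: a purely notional sketch of what I mean by elaborating the verification method (the shape and attribute names here are my own, not from ISO-29148, the OMG specifications, or any tool profile):

        from dataclasses import dataclass, field
        from typing import List

        # Notional only: the method kind alone says little; the elaboration
        # states what objective evidence will prove the as-built system conforms.
        @dataclass
        class ElaboratedVerification:
            verified_requirement_id: str   # system requirement under verification
            verify_method: str             # "Test", "Inspection", "Demonstration", "Analysis"
            success_criteria: str          # what conformance means for this requirement
            objective_evidence: List[str] = field(default_factory=list)  # the proof artifacts

        example = ElaboratedVerification(
            verified_requirement_id="SR-001",
            verify_method="Test",
            success_criteria="Measured response time <= 50 ms in 1000 consecutive trials",
            objective_evidence=["Approved test procedure",
                                "Raw data logs",
                                "Signed test report"],
        )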
  • In reply to Geoffrey Shuebrook:

    Thanks Geoffrey. Our organization's standard practice is to elaborate the verification method (versus just stating the verification method) via a "verification requirement". In my opinion, the verification method should be associated with the verification requirement (versus the requirement being verified) since a requirement may need to be verified by a combination of multiple verification methods. From your response, it sounds like a configuration-managed custom profile might be the way to go. Thank you for the references as well.
  • In reply to Mike Brazinski:

    Mike, glad to hear your organization elaborates the verification method. IMHO, that is a required practice, and it matches my experience with specifications written in the '70s and '80s: for every requirement in Section 3 there was a method elaborated in Section 4.

    I'm not convinced that Black-Box Testing of both functional and performance requirements ever requires multiple verification methods (ISO-29148 semantics) to produce objective evidence of conformance. I do agree that it may require 1..* test cases and/or test configurations and/or test data (vectors) (UML Testing Profile v1.x semantics) to produce said objective evidence.

    But this is a complex subject, and we each may have our own lexicon for the test domain, which makes a discussion in a text-based chat exchange challenging. I am not challenging your assertion; I just don't have an appreciation of your particular semantics, and therefore of your rationale behind the requirement.

    Have you evaluated the UML Testing Profile v2.x? The UTP is heavily influenced by the goals of black-box SW test and European test organizations, but you may find value in adopting a standard rather than creating a custom profile. I advocate the use of standard modeling languages/profiles over proprietary and localized domain languages. The UTP v2.1 spec, on pp. 25-26, discusses the semantics of "test requirement" and "test objective"; you may be interested in that perspective.

    NAVAIR might possess the influence to get MD/CSM to properly support the OMG UTP v2.x specification. Best of luck!
  • In reply to Geoffrey Shuebrook:

    I wholeheartedly agree with everything you said. What is the community's level of embrace of PBR with <<abstractRequirement>> instead of <<Requirement>> or that non-normative <<extendedRequirement>> extension? We have a few programs that have embraced PBR instead of the DOORS/ReqIF-compatible approach that some tools support via profiles on top of <<Requirement>>, though not PBR with ReqIF... We're still looking into whether and how to incorporate UTP2, but we still have a ways to go with Test community adoption; only a few Navy programs have even attempted to model that domain so far. Does anyone else know of actual adoption examples? Like you said, it's a collection of practices based on risk, cost, and schedule, driven by the program office decision-making use cases that the modeling is intended to support.
  • In reply to Jim Ciarcia:

    Have you heard any more about the implementation of UTP2 at NAVAIR? We've put together a T&E Profile to maintain test planning and requirements for our program; however, I'd like to make it integrate better both with the system model and with external platforms. From what I have looked at, the DT/OT terms and elements match up quite well with UTP2 (with the exception of risk), so following that standard might be a good way to go. It wouldn't take too much extending and specializing to make it compliant with UTP2. It would be great to know the status of getting aspects of test into the SE models, if anyone can provide an update on that.
  • In reply to Trisha Radocaj:

    Ms. Radocaj, I am external to NAVAIR. Last year I constructed a model documenting the UTP v2.1 profile. The project was not subjected to review, as UTP v2 was deemed "Too Complex" by the I&T organization. The model should still be reviewed and any non-conformances to the OMG UTP v2.1 profile specification corrected. If this would be helpful to NAVAIR, please feel free to employ the profile model.

    {DRAFT} UML Testing Profile v2.1 May 2020

  • In reply to Geoffrey Shuebrook:

    Mr. Shuebrook - This is really great. You're about 100 steps ahead of me. I loaded the UTP v2.1 from OMG's XMI files into MagicDraw (which I thought was no small feat) and had just added the definitions of terms, but what you've put together is much, much cleaner and more descriptive. I would like to make the cross-walk between the UML-based test profile and the SysML profile for a system model; for example, a system <<block>> may be a <<testitem>> or a <<testcomponent>> (in addition to others), as in the notional mapping sketched below. I'd also like to map what we've done with items of DT & OT to the test profile stereotypes they would specialize. This helps tremendously. Thank you very much for sharing this. I hope to hear some other responses on the status of others who have looked at the UTP.

    PS. Your Comment and Vocabulary Profiles also look like they can be very useful.
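
    As a starting point for that cross-walk, here is the kind of notional mapping I have in mind (candidate pairings only, sketched in plain Python; this is not a vetted or complete crosswalk):

        # Candidate UTP2 stereotypes that SysML elements playing a given role
        # might specialize -- a notional starting point, not a vetted crosswalk.
        SYSML_TO_UTP_CANDIDATES = {
            "block (system under test)": ["TestItem"],
            "block (test harness / test driver)": ["TestComponent"],
            "activity or interaction (test behavior)": ["TestCase"],
            "requirement (what must be verified)": ["TestRequirement", "TestObjective"],
        }

        for sysml_role, utp_stereotypes in SYSML_TO_UTP_CANDIDATES.items():
            print(f"{sysml_role:42} -> {', '.join(utp_stereotypes)}")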
  • In reply to Trisha Radocaj:

    Trisha - Feel free to use and adjust to fit your needs and your organization's policies. I also have a bibliography profile in support of controlled vocabulary development, and there is an alternate vocabulary profile as well. Does your organization embrace the constructs of ISO 15288 (e.g., a test system is an enabling system)? If yes, consideration should be given to a separate model for the test system and its system of tests supporting integration, verification, and validation of the system under development. If you'd like to contact me directly, my email address is in my APAN profile.