Meeting Minutes

INSPIRE Validation and conformity:

Meeting of MIWP-5 sub-group and ARE3NA 2 contractor

Project: Action 1.17: ARE3NA
Meeting Date/Time: 16/12/2015, 9h30-17h00
Meeting Type: Kick-off meeting
Meeting Location: Ispra
Meeting Coordinator: ML
Issue Date: 11/02/2016
Attendee Name / Initials / Organisation
Carlo Cipolloni / CC / ISPRA
Christian Ansorge / CA / European Environment Agency
Clemens Portele / CP / Interactive Instruments
Daniela Hogrebe / DH / GDI-DE
Emidio Stani / ES / PwC EU Services
Freddy Fierens / FF / JRC
Giacomo Martirano / GM / Epsilon Italia
Ilkka Rinne / IR / Spatineo
Jon Hermann / JH / Interactive Instruments
Lorena Hernandez Quiros / LHQ / JRC
Michael Lutz / ML / JRC
Robert Tomas / RT / JRC
Robin Smith / RS / JRC
Simon Vrecar / SV / JRC
Stefania Morrone / SM / Epsilon Italia
Stijn Goedertier / SG / PwC EU Services
Sven Boehme / SB / GDI-DE
Other JRC staff
Meeting agenda
1. Implementation of INSPIRE validator
2. Work to date in MIWP-5 sub-group on validation and conformance testing
3. Discussion on use cases and architecture
Accompanying slide decks: https://ies-svn.jrc.ec.europa.eu/projects/validation-and-conformity-testing
Topic / Summary /
1.  Implementation of INSPIRE validator / In the context of Action 1.17 of the ISA Programme (ARE3NA Action), PwC and interactive instruments (ii) have just started a new project to create a reusable, open-source reference tool for INSPIRE conformance testing.
The ISA Programme is also undertaking another action on interoperability testing, in the context of wider eGovernment activities (ISA action 4.2.6).
The overall tasks and deliverables were discussed. They consist of the following:
·  Provide comments on and finalise the Abstract Test Suites (ATSs);
·  INSPIRE test framework design;
·  INSPIRE test framework implementation;
·  INSPIRE Executable Test Suites (ETSs) implementation; and
·  GITB integration.
The focus of the meeting is a discussion of the INSPIRE test framework design and the requirements and use cases driving the design.
The ATSs, currently hosted on GitHub, are composed of a table for requirements coverage and another for test scenarios. The structure of the ATSs could be extended, for example by including meaningful error messages.
Test framework design: The approach for making the design decisions would consist of the following steps:
1. Collect and prioritise the requirements: there is already an initial list of requirements.
·  List of test objects: metadata record, network service, dataset, spatial data service.
·  Test result requirements.
·  Test project requirements.
·  Test architecture requirements.
·  Licensing (it must be released under an open source licence).
2. Analyse existing solutions and testing frameworks.
3. Design a proposal for an INSPIRE testing framework.
The INSPIRE Maintenance and Implementation Group (MIG) will be involved when gathering requirements for the conformance testing.
The starting assumption for the initial phase of the project is not to build a single generic test engine that allows users to issue a single request and receive a harmonised test report. The objective is to reuse existing operational validators in the INSPIRE validation framework as much as possible and to focus new developments on identified gaps, e.g. the testing of data sets. Any other approach would lead to a duplication of efforts. JRC remarked that this may be contrary to the user requirement of having single reference validation results, driven by the reference ATSs and ETSs developed in the framework of the MIWP-5 activities.
RS remarked that the project must also include ISA action 4.2.6 in the discussion. Furthermore, the implementation process must align with the CEN GITB workshop agreements at the start of the activity.
Concerning the architecture, the initial requirements include a need for a web application (front end) and API which could be used by developers.
Furthermore, the architecture must be modular as different ETS could be added in the future.
The ETS reports should be sufficiently detailed and be both human-readable (HTML) and machine-readable (XML, JSON). It is important to define governance rules for the test results, thinking first about where they will be stored and how they will be accessed.
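As an illustration of what such a machine-readable test report could look like, the sketch below assembles a JSON report with a timestamp and per-test results. All field names, test identifiers and the helper function are illustrative assumptions, not an agreed schema:

```python
import json
from datetime import datetime, timezone

def build_test_report(test_object, results):
    """Assemble a machine-readable test report.

    The structure below is purely illustrative; the actual report
    schema is yet to be defined in the test framework design.
    """
    return {
        "testObject": test_object,  # e.g. the URL of a metadata record
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "results": results,         # one entry per executed test
        "status": "PASSED" if all(r["status"] == "PASSED" for r in results) else "FAILED",
    }

report = build_test_report(
    "https://example.org/metadata/record-1",
    [
        {"test": "md-iso/a.1", "status": "PASSED"},
        {"test": "md-iso/a.2", "status": "FAILED", "message": "Element gmd:language is missing"},
    ],
)
print(json.dumps(report, indent=2))
```

An HTML view for human readers could be rendered from the same data structure, which keeps the two formats consistent.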
Some open questions that came up concern the ETS language/format, which will most probably differ case by case, and the versioning of the rules.
The implementation should start with something small that works and that, from the beginning, covers different test objects. Given the limited resources, a priority list should be established in order to adjust the timeline and speed up the development of the ETSs; priorities should also be determined by the Member States. The development should be driven by the ATSs, focusing first on those that have not been implemented yet.
A question was raised as to whether the ETS would follow a black-box approach. It was agreed that it is definitely not a black-box approach, as all the tests will be available online in editable “source code” form under a licence supporting reuse.
A driver for consolidating the current situation, with multiple validators (ETSs) for the same test objects, is that these validators commonly give different results. An example is the metadata validation in the INSPIRE Geoportal and in validators in the Member States. There is a need for a reference implementation that can be trusted. The hope is that an agreement on the ATSs will provide a consensus on what needs to be tested. At the same time, it is clear that no ATS or ETS can fix vague, unclear or ambiguous requirements. This needs to be clarified in the ATS review and the test framework design, and discussed with the MIG-T.
By releasing the framework – and the ETSs – as open source, MSs and others will be able to download it, extend and/or edit it and run the tests locally. This also adds requirements regarding deployment, packaging and interoperability.
Taking the requirements into account, the design should focus on maintainability, changeability and stability.
Regarding maintenance, the question was raised on who will maintain the test framework and the INSPIRE-specific ETSs. If the MIWP-5 is asked to maintain the test framework, it should not be confronted with design choices it does not feel comfortable with.
The framework should provide a common basis upon which future developments can be done.
A preliminary conclusion of the discussion was that the design should allow the reuse of existing validators, but only to the extent that these validators could become part of the “reference implementation”.
A desired feature is the scheduling of test runs; the GDI-DE Testsuite, for example, already provides this. Some useful links are:
·  http://testsuite.gdi-de.org/gdi/
·  https://wiki.gdi-de.org/display/test/GDI-DE+Testsuite
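As a minimal illustration of scheduled test runs, in the spirit of the GDI-DE Testsuite feature mentioned above, the sketch below queues a fixed number of validation runs with Python's standard-library scheduler. The function names, the endpoint URL and the interval are invented for the example:

```python
import sched
import time

runs = []  # record of completed validation runs

def run_validation(test_object):
    # Placeholder: a real implementation would execute an ETS
    # against the test object and store the resulting report.
    runs.append((time.monotonic(), test_object))

scheduler = sched.scheduler(time.monotonic, time.sleep)

def schedule_periodic(interval_seconds, test_object, repetitions):
    """Queue a fixed number of validation runs, spaced interval_seconds apart."""
    for i in range(repetitions):
        scheduler.enter(i * interval_seconds, 1, run_validation, (test_object,))

schedule_periodic(0.01, "https://example.org/wfs?request=GetCapabilities", 3)
scheduler.run()  # blocks until all queued runs have executed
```

A production deployment would more likely rely on a long-running job scheduler, but the queueing logic is the same.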
Planning: there was a concern about the planning. The contractor should clarify in the test plan what will be available within 18 months. For example, it would be good to know whether the ETSs for the Annex I data specifications will be available, as this would be aligned with the deadlines in the INSPIRE roadmap. This was agreed, but the plan can only be developed once requirements, design and priorities are stable; i.e., planning will be an incremental process, too.
The question was raised how the choice will be made for specific ATSs to be implemented. The JRC will take a decision, where needed, after consultation with the MIG. The priority setting should come from the Member State representatives. As an initial approach to priority setting, it was agreed to address a representative sampling of all types of tests (metadata, network service endpoints, data sets, spatial data service endpoint), rather than concentrate on the completeness of the individual cases (e.g. completing all the metadata ETS before addressing the other types of tests).
The objective of testing is to help LMOs to meet the legal requirements, but also to promote interoperability and give confidence to suppliers and users of data and services that they are interoperable.
2.  Work to date in MIWP-5 sub-group on validation and conformance testing / MIWP-5 sub-group on validation and conformance testing
CC provided an overview of the activities of the MIWP-5 sub-group.
Useful link: https://ies-svn.jrc.ec.europa.eu/projects/validation-and-conformity-testing
MIWP-5 Use Cases
CA presented his work done on defining use cases (defined as user stories) starting by the definition of the possible roles.
Useful link: https://ies-svn.jrc.ec.europa.eu/issues/2611
IR said that he had refined the user stories into more detailed use cases. A first version of this work is available here:
https://ies-svn.jrc.ec.europa.eu/projects/validation-and-conformity-testing/wiki/Validation-related_use_cases
The use cases presented by CA were not yet synchronized with those in the wiki.
Furthermore, the role and activities of data providers were highlighted, including meeting deadlines, sharing experiences of validation problems with their data, and learning whether other countries have similar problems.
MIWP-5 analysis of existing tools/platforms/approaches/languages for conformance testing
DH gave an overview of existing tools/platforms/approaches/languages for conformance testing.
At the time of the analysis, at least 22 tools existed; 13 were operational, of which 7 came from Member States, and 8 tools were multilingual. To be useful for the MSs, the framework should have multilingual support.
A remote API is supported in at least two cases: the REST API of the GDI-DE test suite and the INSPIRE Geoportal pilot validation.
An update of the overview table is planned, once it is clear which additional information is relevant for the test framework design.
Useful link: https://ies-svn.jrc.ec.europa.eu/projects/validation-and-conformity-testing/wiki/Overview_about_existing_validation_toolssolutions
INSPIRE Data validation: the eENVplus experience
SM presented the work that was carried out in the eENVplus project on building a validator for a number of INSPIRE data specifications.
Useful links:
·  http://showcase.eenvplus.eu/client/validation.htm
·  http://cloud.epsilon-italia.it/eenvplus_new/ATS.htm
Requirements for software/test development
IR presented potential requirements for the test framework.
Among others, he made clear that the testing framework should be sufficiently generic, so that the MIWP-5 sub-group could maintain it further.
Useful links:
·  https://ies-svn.jrc.ec.europa.eu/projects/validation-and-conformity-testing/wiki/INSPIRE_testing_framework_-_concepts
Other useful links:
http://cite.opengeospatial.org/teamengine/
https://github.com/opengeospatial/teamengine
http://cite.opengeospatial.org/teamengine/about/gml32/3.2.1/site/
https://github.com/opengeospatial/ets-gml32
http://elfproject.eu/documentation/geotool/etf
3.  Discussion on use cases and architecture / A discussion was held in two sub-groups and summarised by CP in the “Discussion” slide (see the slide deck for more information). The main items that should be addressed in the design are also listed here:
§  Primary target stakeholders are the Data and Service Providers (related to their implementation obligations)
§  User requirements with priority
§  Control what I want to test: Select ETS (specific version), conformance classes
§  Control where I want to test: Central deployment or in my own environment
§  Get informed about the test progress / results in useful way
§  As part of the test result, get proposed text for copying to the metadata of the test object
§  Use the tests in my production and publication workflow (API, scripting)
§  Add additional tests (for extensions, profiles)
§  Architecture requirements with priority
§  Document model of ATS, ETS, test objects, test results, etc.
§  Need to support different types of tests, potentially with different components
§  Data tests (XML potentially very large)
§  Metadata tests (XML, smaller, could be together with the Network Service tests)
§  Network service tests (Web Service)
§  Focus on general conformance, not QoS
§  Need to distinguish test project scope
§  Tests based on the test object alone (“validation test”)
§  Tests involving referenced resources (“interoperability test”)
§  The framework should be based on a generic test engine that can process executable tests specified in one or more ETS languages/formats
§  It should be possible to interact with the test engine through a GUI and through an API
§  Multilingual support
§  Requirements with priority regarding test objects
§  Metadata records
§  Download services
§  Spatial objects in Annex I data themes
§  Requirements with priority regarding test results
§  The test engine shall return a detailed test report in human-readable (e.g. HTML) and machine-readable (e.g. XML, JSON) formats
§  Multilingual support (note: not multi-lingual ETSs)
§  Timestamp
§  Requirements with priority regarding test projects and their management
§  The ETS language/format should allow easy modification of rules and documentation of a version history
§  It should be possible to validate against specific versions of ETSs (based on version of the CC/TG as well as of the ETS)
§  Focus on requirements (fail = error), not recommendations (fail = warning or info)
§  ETS rules should have links to the ATS rules they implement and requirements they test
§  The test engine and ETS languages/formats should be open source
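One of the architecture requirements above is a document model of ATSs, ETSs, test objects and test results. The sketch below illustrates how such a model could be structured; all class and field names are assumptions for illustration, not the model to be proposed by the contractor:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class AbstractTestSuite:
    """An ATS: the technical-guidance requirements to be tested."""
    identifier: str
    requirements: List[str] = field(default_factory=list)

@dataclass
class ExecutableTestSuite:
    """An ETS implementing an ATS in a concrete test language/format."""
    identifier: str
    version: str         # ETSs must be versioned (see requirements above)
    implements_ats: str  # link back to the ATS it implements
    language: str        # the ETS language/format used

@dataclass
class TestObject:
    """The resource under test."""
    kind: str            # "metadata" | "network-service" | "dataset" | "spatial-data-service"
    location: str        # URL or file path

@dataclass
class TestResult:
    ets: ExecutableTestSuite
    test_object: TestObject
    status: str          # "PASSED" | "FAILED"
    messages: List[str] = field(default_factory=list)

# Example: a versioned ETS run against a metadata record
ets = ExecutableTestSuite("ets-md-example", "1.0", "ats-metadata", "language-tbd")
obj = TestObject("metadata", "https://example.org/metadata/record-1")
result = TestResult(ets, obj, "FAILED", ["Element gmd:language is missing"])
```

Making the ETS-to-ATS link and the ETS version explicit in the model supports the traceability and versioning requirements listed above.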
The collaboration and how the MIWP-5 team could best support the development will be discussed in the MIWP-5 January web-meeting.

Decision and actions

Decision Nr / Decision Description / Date of decision taken / Decision Owner
1 / A web meeting organized by CC (Doodle) will be held in January / 2015-12-16 / CC
Action Nr / Action description / Target Resolution Date / Action Owner
1 / interactive instruments to propose a conceptual model for the INSPIRE validator (test framework) and to structure requirements with their priority based on the discussion / 2016-01-31 / CP
2 / PwC to see whether existing standards (CEN GITB Phase 3 CWAs, OASIS TAML, ETSI TDL) could be used for describing tests and test reporting / 2016-01-31 / ES
3 / JRC/PwC/ii to refine planning based on discussion results / MIWP-5 web-meeting / ML