
Quality assuring data and evidence


The model for delivering the organisation's evidence needs means that our functions rely on bought-in services from consultants, evidence shared by stakeholders such as Natural England, the Joint Nature Conservation Committee and the Centre for Environment, Fisheries and Aquaculture Science, and other submissions. Such evidence must therefore be of sufficiently high quality to support our corporate decision making.

Assessing the quality of evidence presented to us is everyone’s responsibility. We need to operate in a culture that embeds this responsibility in every individual who deals with evidence in any of its forms, including data, data products such as maps, reports and publications.

Staff need a process to help them assess the quality of the evidence available to us. A preliminary flow process for quality assuring evidence is presented below.

Quality assuring evidence

A simple checklist with several steps is described below. The checklist is currently under review but represents current thinking.

Quality assurance evidence process

1. Risk assessment
An initial risk assessment (RA) will need to be conducted by the member of staff receiving the evidence, to quickly assess the likely risks to the organisation or to the project from incorporating weak evidence. The RA will also highlight the potential consequences for the project and/or the organisation of the decision taken. An initial RA should use a decision tree to guide staff through the evaluation; a more detailed analysis may be required for more complex situations. Risks to the organisation from accepting or relying on unsound evidence can be financial, reputational, legal or programmatic.
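The decision tree for an initial risk assessment could be sketched as follows. This is an illustrative example only: the questions, their order and the resulting ratings are assumptions for the sketch, not the organisation's actual criteria.

```python
# Illustrative sketch of an initial risk-assessment decision tree.
# The questions and ratings below are assumptions, not official criteria.

def initial_risk_assessment(answers):
    """Walk a simple yes/no decision tree and return a risk rating.

    `answers` maps question keys to booleans; missing answers are
    treated conservatively as "no".
    """
    if not answers.get("source_known", False):
        return "high"    # unknown provenance: treat as high risk
    if not answers.get("methods_documented", False):
        return "medium"  # provenance known but collection methods unclear
    if answers.get("supports_statutory_decision", False):
        return "medium"  # sound evidence, but high-stakes use
    return "low"

rating = initial_risk_assessment({
    "source_known": True,
    "methods_documented": True,
    "supports_statutory_decision": False,
})
print(rating)  # -> low
```

A tree like this makes the evaluation repeatable and gives the audit trail a recorded path through the questions, while complex cases still escalate to a fuller analysis.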

2. Prioritisation
After highlighting the potential risks to both the project and the organisation, a potential solution needs to be identified, together with actions and owners for those actions. For instance, what are the immediate steps to follow if evidence is found to be unsound? What evidence or data alternatives are available? If there are no alternatives, what caveats need to be included?

3. Confidence levels
We have developed a method for assigning confidence levels to data which takes several aspects into consideration, such as the provenance of the data, the methods used to collect it and its consistency. The primary purpose of assigning a confidence level is to highlight issues with the data quality.
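A confidence level of this kind might be derived by scoring each aspect and banding the total. The aspects, the 0–2 scoring and the band thresholds below are hypothetical, chosen only to illustrate the idea.

```python
# Illustrative sketch: deriving a confidence level from three assumed
# aspects (provenance, collection method, consistency), each scored 0-2.
# The scoring and the band thresholds are hypothetical.

def confidence_level(provenance, method, consistency):
    score = provenance + method + consistency  # maximum possible: 6
    if score >= 5:
        return "high"
    if score >= 3:
        return "medium"
    return "low"

# Well-documented supplier, sound method, minor consistency concerns:
print(confidence_level(provenance=2, method=2, consistency=1))  # -> high
```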

4. Categories/classes
The type of evidence being dealt with will require different approaches. For instance, data and metadata will need to be processed, assessed and logged in accordance with our approved data management process and standards. Incoming data to the organisation will need to follow the process so that it can be incorporated into the Master Data Register. Use limitations, caveats and summary information about the procedures used to create the evidence or data must be recorded in a metadata file.

The process developed for our data management provides a confidence level which will flag any known issues with the data, and the metadata allows the user to identify these issues. The user will then be able to decide whether or not to include the data; specific decisions will depend on factors identified in the risk assessment. Evidence in general will also need to be logged into our system by a process similar to that for data. That process is currently under development and will follow shortly.
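The metadata that travels with a dataset could be represented by a record such as the one below. The field names and the example values are assumptions for illustration, not the Master Data Register schema.

```python
# Illustrative sketch: a metadata record carrying use limitations,
# caveats and a confidence flag alongside the data it describes.
# Field names and values are assumptions, not an official schema.

from dataclasses import dataclass, field

@dataclass
class MetadataRecord:
    title: str
    supplier: str
    collection_method: str
    confidence_level: str                 # e.g. "high" / "medium" / "low"
    use_limitations: list = field(default_factory=list)
    caveats: list = field(default_factory=list)

record = MetadataRecord(
    title="Example seabed survey",
    supplier="Example consultancy",
    collection_method="multibeam sonar",
    confidence_level="medium",
    use_limitations=["not suitable for navigation"],
    caveats=["partial coverage of survey area"],
)
```

Keeping limitations and caveats in the record itself means a user consulting the register sees the quality flags before deciding whether to include the data.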

5. Independent/peer review
Independent and peer reviews are critical in evaluating the quality of evidence presented to us. For instance, evidence that has been published in recognised national and/or international scientific or government journals would be welcomed. In addition to published work, peer review could be sought from recognised experts for specific pieces of work submitted to us when gathering evidence or when dealing with bought-in services.

6. Other corroborations
These are written assurances from key partners validating submissions to us. We will be working with our suppliers to ensure that their quality assurance processes are robust and compliant with the Chief Scientific Advisor Guidance on the Use of Scientific and Engineering Advice in Policy Making.

7. Stand alone pieces of work versus work as part of a series
Both types of work need to go through the same evidence assessment process. The process needs to detail the steps to follow to ensure that peer or independent reviewers are engaged and ready to begin work as soon as possible after submission to us.

8. Evidence lineage
Lineage includes, amongst other aspects, methods used in the project, any processing steps, provenance of the data/information, for instance, was the evidence provided by a key partner organisation? Was it provided by a consultant? Have we been provided with the latest versions of the evidence?

9. Training of staff
Staff in the key teams may require training in the processes they handle in their business as usual operations. For instance, the licensing team require further capacity building in environmental impact assessment and appropriate assessment so that they are well versed in the latest best practices and least damaging methods used in the developments they deal with, such as dredging, piling and pipelaying. This improved knowledge will help them assess the quality of the evidence submitted in licence applications.

10. Quality assurance checklist
This checklist is an important part of the audit trail and should be completed by the member of staff conducting the assessment of the evidence. The completed checklist should also be kept in the project file for audit purposes.

Contact information

Marine Licensing Team
Marine Management Organisation
Lancaster House
Hampshire Court
Newcastle upon Tyne

Tel: 0300 123 1032
Fax: 0191 376 2681
Email: marine.consents@
