Course Name: Safety-Critical Systems (M)
Open Assessed Exercise 1
Deadline Time: 16.30 Date: 27th February 2020
% Contribution to final course mark: 20%
Solo or Group: Solo
Secure boxes beside the Student Support and
Enquiries office at the entrance of the Sir Alwyn Williams Building
Please Note: This Coursework cannot be Re-Assessed
Code of Assessment Rules for Coursework Submission
Deadlines for the submission of coursework which is to be formally assessed will be published in course
documentation, and work which is submitted later than the deadline will be subject to penalty as set out below.
The primary grade and secondary band awarded for coursework which is submitted after the published
deadline will be calculated as follows:
(i) in respect of work submitted not more than five working days after the deadline
a. the work will be assessed in the usual way;
b. the primary grade and secondary band so determined will then be reduced by two secondary
bands for each working day (or part of a working day) the work was submitted late.
(ii) work submitted more than five working days after the deadline will be awarded Grade H.
Penalties for late submission of coursework will not be imposed if good cause is established for the late
submission. You should submit documents supporting good cause via MyCampus.
Penalty for non-adherence to Submission Instructions is 2 bands
You must complete an “Own Work” form via
https://studentltc.dcs.gla.ac.uk/ for all coursework
Safety-Critical Systems: Open Assessment 2019-20
Supporting the Introduction of Machine Learning in
Safety-Critical Aerospace Industries (Level M)
Prof. Chris Johnson
School of Computing Science, University of Glasgow, Glasgow, G12 8QQ, Scotland. email@example.com
1 Introduction
The integration of machine learning (ML) technology raises enormous questions over the design,
implementation, verification and validation, certification, operation and maintenance of safety-critical
software. Many ML approaches rely on inference and generalization from training sets that are
then used to inform interaction in less-constrained real-world environments. How can we be sure that
these training sets are sufficient to support safe and successful operation? How can humans identify
and respond to situations where there are potential failures in ML?
2 Tool Development
Your task in the open assessment is to develop a technique that will help identify and address the safety
concerns that arise from the integration of machine learning into safety-related aerospace applications
(aviation or space systems). You should begin by identifying the types of system you will consider – it
could be an autonomous drone (UAV/RPAS) or a satellite application where the data has a safety related
consequence. You need to select the system early to help focus your work. Identify existing research –
for instance, using Google Scholar and similar services. This will help you to be clear on the potential
risks that can arise from the use of ML – you may also choose to focus on specific ML approaches. The
aim is to enable senior or middle management from stakeholder organisations to assess and mitigate
the risks associated with machine learning in aerospace. Stakeholders in this context include, but are
not limited to, system manufacturers, regulators and operators.
The choice of risk assessment technique is open. You may choose to extend one of the approaches that
are introduced during this course, such as Fault Trees or Failure Modes, Effects and Criticality Analysis.
Alternatively, you may choose to develop an entirely new method. However, if you use an existing
approach you must show how it can be used with a detailed case study where ML might support a
safety-related aerospace application. The key aim is to help organizations assess the likelihood and
consequence of hazards that can arise from the integration of machine learning. The specific focus must
be on helping managers mitigate those risks by appropriate planning before an aerospace ML system is
operated outside of a test environment.
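If you extend a classical approach such as Fault Trees, one way to make it concrete is to show how gate logic could quantify an ML-related hazard. The sketch below evaluates a small two-level tree; the basic events, their probabilities and the independence assumption are all hypothetical illustrations for a drone obstacle-detection scenario, not data from any real system:

```python
# Minimal fault-tree sketch for an ML-related hazard (illustrative only:
# the gate structure and probabilities are hypothetical assumptions).

def or_gate(*probs):
    """P(at least one child event occurs), assuming independent events."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

def and_gate(*probs):
    """P(all child events occur), assuming independent events."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# Hypothetical basic events for a UAV obstacle-detection classifier.
p_training_gap = 0.02   # operating scenario absent from the training set
p_sensor_noise = 0.01   # degraded input outside the training distribution
p_monitor_miss = 0.05   # runtime monitor fails to flag low confidence

# Top event: an undetected misclassification reaches the flight controller.
p_misclassify = or_gate(p_training_gap, p_sensor_noise)
p_top = and_gate(p_misclassify, p_monitor_miss)

print(f"P(misclassification) = {p_misclassify:.4f}")
print(f"P(top event)         = {p_top:.6f}")
```

Even a toy calculation like this exposes the questions your technique must answer: where do the probabilities for ML basic events come from, and is independence a defensible assumption?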
You could develop the approach using pencil and paper techniques. Alternatively, you may choose to
develop electronic tools that support the application of your risk assessment technique using any
programming methodology. The implementation of the tool could rely on simple web pages generated
using HTML, PHP or any other associated technology. However, the marking scheme will take into
account both the strength of the design of the risk assessment technique and the effectiveness of any
implementation, in terms of the support they offer to potential end users.
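Whatever front end you build, the core of such a tool might reduce to a likelihood/consequence lookup of the following kind. The category names, scoring rule and example hazards below are illustrative assumptions rather than any published standard:

```python
# Sketch of a likelihood/consequence risk classifier of the kind a simple
# web tool could compute behind the scenes (categories and thresholds
# are illustrative assumptions, not a published standard).

LIKELIHOOD = ["remote", "occasional", "probable", "frequent"]    # index 0..3
CONSEQUENCE = ["minor", "major", "hazardous", "catastrophic"]    # index 0..3

def risk_class(likelihood: str, consequence: str) -> str:
    """Classify a hazard by the sum of its two category indices."""
    score = LIKELIHOOD.index(likelihood) + CONSEQUENCE.index(consequence)
    if score >= 5:
        return "unacceptable"
    if score >= 3:
        return "review required"
    return "acceptable"

# Hypothetical ML hazards for a satellite imaging application.
hazards = [
    ("model drift after retraining", "occasional", "hazardous"),
    ("mislabeled training data", "remote", "major"),
]
for name, lik, con in hazards:
    print(f"{name}: {risk_class(lik, con)}")
```

A real tool would let stakeholders adjust the thresholds and record the mitigation chosen for each hazard; this fragment only shows the classification step.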
3 Evaluation
It is important that you evaluate your technique/tool for assessing the risks associated with ML in safety-related
aerospace systems. One means of doing this would be to ask a number of different users to try
out your risk assessment technique on a case study, exploiting an appropriate evaluation methodology.
For example, you could ask one group to use your technique and another to use an alternative approach
developed by someone else in the course. If you work in a team you MUST consider the relevant
plagiarism guidance on the School Learning and Teaching Committee web site and state the name of
the person you worked with on your submission. You must develop your reports independently of each
other. You also need to consider the level of existing expertise that the people you test will have in risk
assessment, aerospace and ML. Please consult with me before conducting your evaluation so that I can
provide advice in answering some of these questions. You should also consult the course handbook and
associated web pages that cover the ethical guidelines for user testing.
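If you do collect scores from two groups, even a simple descriptive summary can anchor your results section. The participant scores below are invented placeholders, and with samples this small a real analysis should add an appropriate statistical test rather than rely on the raw difference in means:

```python
# Minimal between-groups summary for the evaluation (the scores and the
# "hazards identified" measure are hypothetical placeholders, not data).

from statistics import mean, stdev

# Hazards correctly identified per participant, out of a fixed set.
group_a = [7, 8, 6, 9, 7]   # participants using your technique
group_b = [5, 6, 5, 7, 6]   # participants using an alternative technique

diff = mean(group_a) - mean(group_b)
print(f"Group A: mean={mean(group_a):.1f}, sd={stdev(group_a):.2f}")
print(f"Group B: mean={mean(group_b):.1f}, sd={stdev(group_b):.2f}")
print(f"Difference in means: {diff:.1f}")
```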
4 Transferable Skills
This exercise will provide a first-hand introduction to the challenges that face both companies and
government regulators who are working together to develop new markets for ML in safety critical
systems. There is little common agreement on the best approaches to adopt, and hence you will be
working in an area of active research, which is also a focus for public, government and commercial interest.
5 Assessment Criteria and Submission Details
This exercise is degree assessed. It contributes 20% to the total marks associated with this course. The
body of the report should not exceed fifteen A4 pages. The report must be printed out and must be
submitted in a binder (something that keeps the pages together and does not have sharp edges). It must
include: a title page containing your contact details (student number, email etc); a table of contents and
appropriate page numbers; a section on the tool that you developed; a section on the evaluation method
that you used; a results section and some conclusions.
In addition to the fifteen pages in the body of the report, you may also include appendices. These should
contain listings of any code used during the study (these can be included on a CD or other media),
together with suitable acknowledgements for the source of any code that has been borrowed from other
programmers. The report should be handed in by 16.30 on 27th February 2020 using the secure
boxes beside the Student Support and Enquiries office at the entrance of the Sir Alwyn Williams
Building. Please make sure that you keep back-up copies of all of your work and submit a plagiarism
statement using the standard on-line form. The following marking scheme will be applied: 30 for the
method; 20 for the results; 30 for the conclusion; 20 for the technical documentation. All solutions must
be the work of the individual submitting the exercise and the usual lateness penalties will apply unless
I am given good reason in advance of the deadline.
You must state the title of this question on the front of your submission so I know you are answering
the level M open exercise.
You will need to do considerable reading first so please do not delay starting this assessment.