Automated testing and code analysis

Project Description


Automated testing and code analysis are now a standard part of software development, as are continuous integration and delivery using automated pipelines.

Developers can get metrics about their code on every commit to source control.


The goal of this project is to associate rubrics with these metrics, allowing lecturers to devise automated marking schemes.

The intention is that students will submit a version of their code for assessment via a source control system; through a “hook”, the code will be subjected to known tests, and the output of those tests will be combined with rubrics to produce a mark.
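
As a minimal sketch of what the receiving end of such a hook might look like, assuming a Spring Boot service (mentioned later under Products); the endpoint path, the secret check, and the commented-out pipeline call are invented for illustration:

    import org.springframework.http.ResponseEntity;
    import org.springframework.web.bind.annotation.*;

    // Receives GitLab push events for student submissions.
    // X-Gitlab-Event and X-Gitlab-Token are the headers GitLab sends to webhooks.
    @RestController
    public class SubmissionWebhookController {

        @PostMapping("/webhook/submission")   // hypothetical path
        public ResponseEntity<String> onPush(
                @RequestHeader("X-Gitlab-Event") String event,
                @RequestHeader("X-Gitlab-Token") String token,
                @RequestBody String payload) {
            // Reject calls that lack the shared secret configured on the GitLab webhook.
            if (!"expected-secret".equals(token)) {
                return ResponseEntity.status(403).body("bad token");
            }
            // Only push events count as a submission here.
            if (!"Push Hook".equals(event)) {
                return ResponseEntity.ok("ignored");
            }
            // Hand off to the (hypothetical) marking pipeline: run the known tests,
            // trigger SonarQube analysis, apply the rubric.
            // markingPipeline.enqueue(payload);
            return ResponseEntity.ok("queued for marking");
        }
    }

GitLab would be pointed at this endpoint through a project webhook, with the same secret token set in the webhook configuration.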


Ideally, the solution will support multiple languages.

Cover Page, Declaration, Acknowledgment, Title page

Contents, List of Tables, Glossary

Abstract (summary of not more than 300 words)

Introduction (write last; few references needed here)

  • Motivation of the project

  – For the business: e.g. saving lecturers time when assessing student work

  – Students can get basic feedback right away from the API

  – Etc…

  • Whatever background the reader will need in order to understand the business need and/or the nature of the problem (the dissertation is for non-experts, so explain concepts and context, supported by evidence)

  • Explain what comes in each section (strike a balance between giving adequate explanation and going into too much detail)

Aim and Objectives

  • Clear aim

The goal of this project is to associate rubrics with these metrics, allowing lecturers to devise automated marking schemes.

  • Appropriate objectives to meet this aim:
  • SonarQube code-analysis API (see the sketch after this list)
  • GitLab to call the SonarQube API and to receive student submissions
  • Continuous integration
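
For the SonarQube objective, a sketch of reading metrics over its Web API; api/measures/component and the metric keys are documented SonarQube names, while the host, project key, and token handling below are placeholders:

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.util.Base64;

    public class SonarMetricsClient {
        public static void main(String[] args) throws Exception {
            String host = "https://sonarqube.example.ac.uk";   // placeholder host
            String projectKey = "student-submission-42";       // placeholder project key
            String token = System.getenv("SONAR_TOKEN");       // a SonarQube user token

            // SonarQube accepts the token as the Basic-auth username with an empty password.
            String auth = Base64.getEncoder().encodeToString((token + ":").getBytes());

            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create(host + "/api/measures/component"
                            + "?component=" + projectKey
                            + "&metricKeys=bugs,code_smells,coverage,duplicated_lines_density"))
                    .header("Authorization", "Basic " + auth)
                    .GET()
                    .build();

            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.body());   // JSON: one measure per requested metric
        }
    }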

Background material

  • See Handbook 16

Problem

  • Business

How can this software improve the way lecturers mark student work? And so on.

  • Organisational

The university’s view of this software

  • Academic environment
  • Sufficient detail to make readers understand your study and what characteristics of this problem make it difficult
  • Benefits expected to arise as a result of your study (if detailed, put the benefits in an appendix)
  • Involve a survey of what has already been done in this field
  • Critical review of published literature

Approach

  • How you decided upon this approach
  • No detailed discussion of the mechanics of methodology
  • But contrast different approaches and explain why others were discarded


Application of the chosen approach

  • What I actually did and studied
  • Discussion of my research activities related to “finding out”

Products (see Handbook 16)

  • Requirement

A Spring Boot (Java) service using the SonarQube API (a rubric-mapping sketch follows this list)

  • Design
  • Implementation
  • Project management
  • Results
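
To make the requirement concrete, a sketch of how a rubric might map metric values to marks; the bands, the 20-mark total, and the bug-count inversion are invented examples rather than a prescribed scheme:

    import java.util.List;
    import java.util.Map;

    public class RubricMarker {

        // A band awards `marks` when the metric value is at or above `threshold`.
        record Band(double threshold, int marks) {}

        // Example rubric; bands are listed from highest threshold to lowest
        // because markFor returns the first band the value reaches.
        static final Map<String, List<Band>> RUBRIC = Map.of(
                "coverage", List.of(new Band(80, 10), new Band(60, 6), new Band(40, 3)),
                "bugs_absent", List.of(new Band(1.0, 10), new Band(0.5, 5)));

        static int markFor(String metric, double value) {
            for (Band band : RUBRIC.getOrDefault(metric, List.of())) {
                if (value >= band.threshold()) return band.marks();
            }
            return 0;   // below every band: no marks for this criterion
        }

        public static void main(String[] args) {
            double coverage = 72.0;               // e.g. 72% test coverage
            double bugsAbsent = 1.0 - 2 / 10.0;   // 2 bugs against a 10-bug cap
            int total = markFor("coverage", coverage) + markFor("bugs_absent", bugsAbsent);
            System.out.println("Mark: " + total + "/20");   // prints Mark: 11/20
        }
    }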


Analysis

  • Data mining / machine learning

Which metrics generated the most and the fewest marks? (A correlation sketch follows this list.)

  • Reliability of your products
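
For the metrics-versus-marks question above, a first-pass sketch using Pearson correlation; the sample arrays are invented and would be replaced by data collected from real marking runs:

    public class MetricMarkCorrelation {

        // Pearson correlation coefficient between two equal-length samples.
        static double pearson(double[] x, double[] y) {
            int n = x.length;
            double sx = 0, sy = 0, sxx = 0, syy = 0, sxy = 0;
            for (int i = 0; i < n; i++) {
                sx += x[i]; sy += y[i];
                sxx += x[i] * x[i]; syy += y[i] * y[i];
                sxy += x[i] * y[i];
            }
            double cov = sxy - sx * sy / n;   // n times the covariance
            double vx = sxx - sx * sx / n;    // n times the variance of x
            double vy = syy - sy * sy / n;    // n times the variance of y
            return cov / Math.sqrt(vx * vy);
        }

        public static void main(String[] args) {
            double[] coverage = {45, 60, 72, 81, 90};   // invented coverage values
            double[] marks    = {40, 55, 63, 74, 85};   // invented final marks
            System.out.printf("coverage vs mark: r = %.2f%n", pearson(coverage, marks));
        }
    }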

Conclusions (Handbook 17)

  • Critically evaluate your study.
  • What could be done given more time:
  • A rule engine
  • Improved functionality
  • UI improvements
  • Etc.

Reflection/Learning (Handbook 17)

References

Bibliography

Appendices
