Teaching clean code
Karlstad University, Faculty of Health, Science and Technology (starting 2013), Department of Mathematics and Computer Science (from 2013). ORCID iD: 0000-0002-0107-2108
2018 (English). In: Combined Proceedings of the Workshops of the German Software Engineering Conference 2018 (SE 2018) / [ed] Krusche S., Schneider K., Kuhrmann M., Heinrich R., Jung R. et al., CEUR-WS, 2018, p. 24-27. Conference paper, Published paper (Refereed)
Abstract [en]

Learning programming is hard, and teaching it well is even more challenging. At university, the focus is often on functional correctness and neglects the topic of clean and maintainable code, despite the dire need for developers with this skill set in the software industry. We present a feedback-driven teaching concept for college students in their second to third year that we have applied and refined successfully over a period of more than six years and for which we received the faculty's teaching award. Evaluating the learning process within a semester of student submissions (n=18) with static code analysis tools shows satisfying progress. Having identified the correction of the in-semester programming assignments as the bottleneck for scaling the number of students in the course, we propose using a knowledge base of code examples to decrease the time to feedback and increase feedback quality. From our experience in assessing student code, we have compiled such a knowledge base of the typical issues in Java learners' code in the format of before/after comparisons. By simply referencing the relevant problem to the student, the quality of feedback can be improved, since such comparisons let the student understand the problem and the rationale behind the solution. Further speed-up is achieved by using a curated list of static code analysis checks that helps the corrector identify violations in the code swiftly. We see this work as a foundational step towards online courses with hundreds of students learning how to write clean code.
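The before/after comparison format mentioned in the abstract can be illustrated with a common pattern in Java learners' code. The example below is our own illustrative sketch of such a comparison, not an entry taken from the paper's actual knowledge base:

```java
// Illustrative before/after comparison of a typical Java learner issue:
// returning a boolean via an if/else instead of the condition itself.
public class BeforeAfter {

    // Before: verbose; the if/else obscures that the condition
    // already is the result.
    static boolean isAdultBefore(int age) {
        if (age >= 18) {
            return true;
        } else {
            return false;
        }
    }

    // After: the condition is returned directly, which is shorter
    // and states the intent plainly.
    static boolean isAdultAfter(int age) {
        return age >= 18;
    }

    public static void main(String[] args) {
        // Both versions behave identically; only readability differs.
        System.out.println(isAdultBefore(17) == isAdultAfter(17)); // prints true
        System.out.println(isAdultBefore(18) == isAdultAfter(18)); // prints true
    }
}
```

Pairing the two versions side by side, as above, lets a student see both the problem and the rationale behind the fix, which is what makes the format useful for feedback. Common static analysis tools (e.g. PMD, Checkstyle) can flag this kind of pattern automatically.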

Place, publisher, year, edition, pages
CEUR-WS, 2018. p. 24-27
Series
CEUR Workshop Proceedings, ISSN 1613-0073 ; 2066
Keywords [en]
Codes (symbols), Knowledge based systems, Software engineering, Teaching, Functional correctness, Learning programming, Programming assignments, Quality of feedbacks, Software industry, Static code analysis, Static code analysis tools, Students learning, Students
National Category
Other Computer and Information Science; Didactics
Identifiers
URN: urn:nbn:se:kau:diva-67084
Scopus ID: 2-s2.0-85044541080
OAI: oai:DiVA.org:kau-67084
DiVA id: diva2:1198875
Conference
SE-WS 2018 Software Engineering Workshops, 6 March 2018, Ulm, Germany
Available from: 2018-04-19 Created: 2018-04-19 Last updated: 2018-05-16 Bibliographically approved

Open Access in DiVA

No full text in DiVA

Other links

Scopus (fulltext)

Authority records BETA

Lenhard, Jörg
