Computer Science Report Abstracts


Report Series A
Report A/2006/3
Risto T. Honkanen
Nearly-All-Optical Routing in Sparse Optical Tori

In this paper we present an all-optical network architecture and a nearly-all-optical router for it. The sparse optical torus network consists of an n×n torus, where processors are deployed diagonally. Addresses of packets are encoded and recognized by using fiber Bragg grating arrays. The optical address recognition ensures that only a few logical gates are needed to implement routing decisions at the routing nodes.

Keywords: sparse optical torus, optical communication, fiber Bragg grating
Classification (ACM CCS 1998): F.2.2, F.4.3, I.7.2


Report A/2006/2
Pekka Kilpeläinen, Rauno Tuhkanen
One-Unambiguity of Regular Expressions with Numeric Occurrence Indicators

Regular expressions with numeric occurrence indicators are an extension of traditional regular expressions, which let the required minimum and the allowed maximum number of iterations of subexpressions be described with numeric parameters. We consider the problem of testing whether a given regular expression E with numeric occurrence indicators is 1-unambiguous or not. This condition means, informally, that any prefix of any word accepted by expression E determines a unique path of matching symbol positions in E. The main contribution of this paper is a polynomial-time method for solving this problem, and a formal proof of its correctness.

Keywords: regular expression, numeric iteration, interval expression, one-unambiguity, XML Schema, unique particle attribution
Classification (ACM CCS 1998): F.2.2, F.4.3, I.7.2
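
To make the notion concrete, the following Python sketch (an illustrative toy restricted to plain regular expressions with union, concatenation and star, i.e. without the numeric occurrence indicators treated in the report) tests 1-unambiguity through the classical characterization: an expression is 1-unambiguous exactly when its Glushkov position automaton is deterministic.

    # Toy 1-unambiguity test for plain regular expressions (union, concatenation,
    # Kleene star only; numeric occurrence indicators are NOT handled).  It uses the
    # classical fact that E is 1-unambiguous iff its Glushkov automaton is
    # deterministic: no two positions reachable on the same symbol from the start
    # (first set) or from any single position (follow sets).

    class Sym:                                   # a marked symbol position
        def __init__(self, ch): self.ch = ch
    class Cat:
        def __init__(self, l, r): self.l, self.r = l, r
    class Alt:
        def __init__(self, l, r): self.l, self.r = l, r
    class Star:
        def __init__(self, e): self.e = e

    def nullable(e):
        if isinstance(e, Sym):  return False
        if isinstance(e, Star): return True
        if isinstance(e, Alt):  return nullable(e.l) or nullable(e.r)
        return nullable(e.l) and nullable(e.r)                     # Cat

    def first(e):
        if isinstance(e, Sym):  return {e}
        if isinstance(e, Star): return first(e.e)
        if isinstance(e, Alt):  return first(e.l) | first(e.r)
        return first(e.l) | (first(e.r) if nullable(e.l) else set())

    def last(e):
        if isinstance(e, Sym):  return {e}
        if isinstance(e, Star): return last(e.e)
        if isinstance(e, Alt):  return last(e.l) | last(e.r)
        return last(e.r) | (last(e.l) if nullable(e.r) else set())

    def fill_follow(e, table):
        if isinstance(e, Sym):
            table.setdefault(e, set())
        elif isinstance(e, Star):
            fill_follow(e.e, table)
            for x in last(e.e): table[x] |= first(e.e)
        else:                                    # Alt or Cat
            fill_follow(e.l, table); fill_follow(e.r, table)
            if isinstance(e, Cat):
                for x in last(e.l): table[x] |= first(e.r)

    def deterministic(positions):
        symbols = [p.ch for p in positions]
        return len(symbols) == len(set(symbols))

    def one_unambiguous(e):
        table = {}
        fill_follow(e, table)
        return deterministic(first(e)) and all(deterministic(s) for s in table.values())

    # (a|b)*a is not 1-unambiguous: a prefix 'a' may match either 'a' position.
    e1 = Cat(Star(Alt(Sym('a'), Sym('b'))), Sym('a'))
    # b*a(b*a)* denotes the same language and is 1-unambiguous.
    e2 = Cat(Cat(Star(Sym('b')), Sym('a')), Star(Cat(Star(Sym('b')), Sym('a'))))
    print(one_unambiguous(e1), one_unambiguous(e2))                # False True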


Report A/2006/1
Keijo M.J. Haataja
Security in Bluetooth, WLAN and IrDA: a comparison

Excluding mobile phone related data transfer, there are three popular wireless data transfer technologies: Bluetooth, WLAN and IrDA. In this report we compare the security of these three technologies.


Report A/2005/2
J. Nieminen
Efficient implementation of Unicode string pattern matching automata in Java

We study different efficient implementations of an Aho-Corasick pattern matching automaton when searching for patterns in Unicode text. Much of the previous research has been based on the assumption of a relatively small alphabet, for example the 7-bit ASCII. Our aim is to examine the differences in performance arising from the use of a large alphabet, like the 16-bit Unicode that is widely used today. The main concern is the representation of the transition function of the pattern matching automaton. We examine and compare array, linked list, hashing, balanced tree, perfect hashing and triple-array representations. For perfect hashing, we present an algorithm that constructs the hash tables in expected linear time and linear space.

We implemented the Aho-Corasick automaton in Java using the different transition function representations, and we evaluate their performance. Triple-array performed best in our experiments, with perfect hashing, hashing and balanced tree coming next. We discovered that the array implementation has a slow preprocessing time when using the Unicode alphabet. It seems that the use of a large alphabet can slow down the preprocessing time of the automaton considerably depending on the transition function representation used.

Keywords: string pattern matching, Aho-Corasick, implementation, transition function
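
As an illustration of one sparse representation discussed above, the following Python sketch (hypothetical code, not the report's Java implementation) stores the goto function of an Aho-Corasick automaton as a per-state hash map; an array representation would instead reserve one slot per alphabet symbol in every state, which is what makes preprocessing expensive for a 16-bit alphabet.

    # Aho-Corasick with hash-map transitions (a sparse representation suited to a
    # large alphabet); illustration only, not the report's Java code.
    from collections import deque

    def build(patterns):
        goto, fail, out = [{}], [0], [set()]
        for p in patterns:                       # 1) build the trie of the patterns
            s = 0
            for ch in p:
                if ch not in goto[s]:
                    goto.append({}); fail.append(0); out.append(set())
                    goto[s][ch] = len(goto) - 1
                s = goto[s][ch]
            out[s].add(p)
        q = deque(goto[0].values())              # 2) failure links by BFS from depth 1
        while q:
            r = q.popleft()
            for ch, s in goto[r].items():
                q.append(s)
                f = fail[r]
                while f and ch not in goto[f]:
                    f = fail[f]
                fail[s] = goto[f].get(ch, 0)
                out[s] |= out[fail[s]]           # inherit matches of the failure state
        return goto, fail, out

    def search(text, goto, fail, out):
        s, hits = 0, []
        for i, ch in enumerate(text):
            while s and ch not in goto[s]:
                s = fail[s]
            s = goto[s].get(ch, 0)
            for p in out[s]:                     # all patterns ending at position i
                hits.append((i - len(p) + 1, p))
        return hits

    print(sorted(search("ushers", *build(["he", "she", "his", "hers"]))))
    # [(1, 'she'), (2, 'he'), (2, 'hers')]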


Report A/2005/1
N. Päivinen, T. Grönfors (editors)
Proceedings of the 2005 Finnish Signal Processing Symposium (FINSIG'05)

This volume contains the full papers presented at the 2005 Finnish Signal Processing Symposium, FINSIG'05, held on August 25, 2005, at the University of Kuopio, Finland. The FINSIG symposia are intended for young researchers, both graduate and postgraduate students, specializing in signal processing. Topics covered in the symposium include, for example, communications signal processing, fuzzy methods, image processing, filter design, medical image and signal processing, neural networks for signal processing, nonlinear signal processing, and speech processing.


Report A/2003/4
M. Rönkkö
Previsualization in Robotics: An Atomic Approach

In this paper, we discuss previsualization in robotics. Previsualization is a design technique used in the film industry. In previsualization, a computer animates a highly underspecified scene, thus helping the designers detect missing components in the scene.

Previsualization can also be used in robotics for inspecting a highly underspecified model. However, for previsualization to work, it should reveal detailed dynamics not explicitly specified in the model. The details should appear as emergent dynamics. In robotics, in particular, the emergent dynamics should cover the interaction dynamics of the physical components. Such a requirement is non-trivial.

The contribution of this paper is an investigation of an atomic approach that supports previsualization in robotics. The approach is based on the use of atoms. Atoms obey simple, compositional interaction laws. The laws produce emergent interaction dynamics for physical components composed of atoms. We also illustrate how atoms, despite their simplicity, can compactly capture a non-trivial collision of non-rigid strings, and how the emerging dynamics nevertheless shows intricate details.

Keywords: robotics, previsualization, emergent dynamics
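
The flavour of the atomic approach can be conveyed by a small Python sketch (hypothetical constants and a much simpler setting than the paper's string-string collisions): point-mass atoms coupled only by a local spring law are dropped onto a floor, and the behaviour of the non-rigid string emerges from composing that one law.

    # Toy "atoms" previsualization: a non-rigid string modelled as point masses
    # coupled by one simple local spring law; gravity and a floor are the only other
    # ingredients.  All constants are illustrative, not the paper's model.
    N, REST, K, M, G, DT, DAMP = 10, 0.1, 100.0, 1.0, 9.81, 0.001, 0.2

    pos = [[i * REST, 1.0] for i in range(N)]    # string starts horizontal, 1 m high
    vel = [[0.0, 0.0] for _ in range(N)]

    def spring_force(a, b):
        dx, dy = b[0] - a[0], b[1] - a[1]
        dist = (dx * dx + dy * dy) ** 0.5 or 1e-9
        f = K * (dist - REST)                    # Hooke's law along the link
        return f * dx / dist, f * dy / dist

    for step in range(5000):                     # semi-implicit Euler integration
        forces = [[0.0, -M * G] for _ in range(N)]
        for i in range(N - 1):                   # each atom only talks to its neighbour
            fx, fy = spring_force(pos[i], pos[i + 1])
            forces[i][0] += fx; forces[i][1] += fy
            forces[i + 1][0] -= fx; forces[i + 1][1] -= fy
        for i in range(N):
            vel[i][0] += DT * (forces[i][0] / M - DAMP * vel[i][0])
            vel[i][1] += DT * (forces[i][1] / M - DAMP * vel[i][1])
            pos[i][0] += DT * vel[i][0]
            pos[i][1] += DT * vel[i][1]
            if pos[i][1] < 0.0:                  # inelastic contact with the floor
                pos[i][1], vel[i][1] = 0.0, 0.0

    print([round(p[1], 3) for p in pos])         # all atoms have settled on the floor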


Report A/2003/3
R. Honkanen
Systolic Routing in an Optical Butterfly

In this paper we present an all-optical network architecture and a systolic routing protocol for it. The r-dimensional optical butterfly (OBF) network consists of r·2^r nodes and r·2^(r+1) edges. Processors are deployed at the level 0 (identical to level r) nodes of the network. Routing is based on the use of a cyclic control bit sequence and scheduling. The systolic routing protocol ensures that no electro-optical conversion is needed in the intermediate routing nodes and all the packets injected into the routing machinery will reach the target without collisions. A work-optimal routing of an h-relation is achieved with a reasonable size of h in Ω(n log n).


Report A/2003/2
R. Honkanen
Systolic Routing in an Optical Fat Tree

In this paper we present an all-optical network architecture and a systolic routing protocol for it. An r-dimensional optical fat tree (OFT) consists of 2^r - 1 routing nodes and n = 2^r processing nodes deployed at the leaf nodes of the network. In our construction packets injected into the OFT carry no routing information. Routing is based on the use of a cyclic control bit sequence and scheduling. The systolic routing protocol ensures that no electro-optical conversion is needed in the intermediate routing nodes and all the packets injected into the routing machinery will reach their target without collisions. A work-optimal routing of an h-relation is achieved with a reasonable size of h in Ω(n log n).


Report A/2003/1
P. Kilpeläinen, N. Päivinen (editors)
Proceedings of the Eighth Symposium on Programming Languages and Software Tools

This volume contains the full papers presented at the Eighth Symposium on Programming Languages and Software Tools, SPLST'03, held June 17-18, 2003, in Kuopio, Finland. The papers were selected by the program committee from the 25 submissions received in response to the call for papers. In addition, an invited talk was given by Prof. Martti Penttonen on the topic "How to program a parallel computer?", and two submitted papers were included in the program as short presentations.

The objective of the Fenno-Ugric Symposium on Programming Languages and Software Tools is to provide a forum for the presentation and discussion of recent research and development by software scientists. The SPLST series of conferences arose in 1989 from the cooperation of Finnish and Hungarian universities. Since then the symposium has been organized every two years, with participants coming from various institutes and universities of Estonia, Finland and Hungary.


Report A/2002/4
J. Stoll
The Hardest Context-free Language Revised

The Hardest Context-free Language is a context-free language (cfl) from which any cfl may be obtained by an inverse homomorphism (S. Greibach, 1973). One key to proving that the Hardest Context-free Language exists is that every cfl L can be generated by a context-free grammar (cfg) G in Greibach Normal Form (GNF) such that L = L(G) (S. Greibach, 1965). We present a new proof of this well-known result using the new method of transforming a cfg into 2-Greibach Normal Form presented by N. Blum and R. Koch (1999). Revising the proof of existence enables us to predict the length of pairs of sentential forms. The result is that a pair of sentential forms, organized in blocks as given by the definition of the Hardest Context-free Language, has an upper and a lower bound on its length.

Classification: F.4.3
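
The role of the inverse homomorphism can be illustrated with a small Python sketch (a toy stand-in: the target language here is the Dyck language of balanced parentheses, not Greibach's hardest language): w belongs to h^-1(H) exactly when h(w) belongs to H, so one recognizer for H decides membership in every language obtained from it by an inverse homomorphism.

    # Toy illustration of obtaining a language as an inverse homomorphic image.
    # H is the Dyck language of balanced parentheses (a stand-in target, NOT the
    # hardest context-free language); h maps source symbols to strings over H's
    # alphabet, and membership in h^-1(H) is decided by one recognizer for H.

    def in_dyck(s):                      # recognizer for the target language H
        depth = 0
        for ch in s:
            depth += 1 if ch == '(' else -1
            if depth < 0:
                return False
        return depth == 0

    h = {'a': '(', 'b': ')', 'c': '()'}  # a homomorphism defined on single symbols

    def in_inverse_image(w):             # w in h^-1(H)  <=>  h(w) in H
        return in_dyck(''.join(h[ch] for ch in w))

    print(in_inverse_image('acb'))       # h('acb') = '(())'  -> True
    print(in_inverse_image('ba'))        # h('ba')  = ')('    -> False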


Report A/2002/3
M. Ek, H. Hakkarainen, P. Kilpeläinen, T. Penttinen
Perinnedatan automaattinen muotoilu XML-tekniikoin

XML technologies support the multi-purpose use of document data: different presentation formats for printing and for digital media can be produced from the same XML form. Applying XML technologies requires converting legacy data into XML form. Methods exist for this, but the corresponding conversion of formatting specifications into XML form has received little attention. For example, XSL is a powerful formatting language for XML documents, but even with XSL, handling dozens of different element types requires writing dozens of corresponding formatting rules. If exact formatting specifications have been applied to the legacy data, it would be desirable to be able to produce the formatting for the XML form of the data from those specifications as automatically as possible. As an application example of XML technologies, we consider the conversion of line-based material, printed in a formatted layout, into XML form. For this purpose we present an architecture we have developed that automatically converts and formats the given material, guided by a control file describing its structure and formatting. The architecture is based on converting the processed files into XML form with an "XML wrapping language" called XW that we have developed. We also report on the experiences with applying XML technologies gained from implementing the system.


Report A/2002/2
M. Ek, H. Hakkarainen, P. Kilpeläinen, T. Penttinen
Declarative XML Wrapping of Data

XML provides a standard technology for archiving information and for transferring it between co-operating systems as well-formed documents. Translation of legacy data to an XML-based representation, often called "XML wrapping", is a recurring practical problem in the utilization of XML. We attack this problem by describing a declarative XML wrapper description language called XW (XML Wrapper). XW is designed to be a convenient language for describing typical XML wrapping of data. The design of the language is influenced by a number of XML technologies, such as XML Namespaces, XML Schema, and XSLT. We are currently applying XW for automating the conversion of medical messages and mass-printing material to XML documents. We discuss the implementation techniques and principles of executing declarative XW wrapper descriptions. The XW implementation provides a SAX (Simple API for XML) interface, which means that it can be easily and efficiently used within other applications to access non-XML data sources as if they were XML documents seen through an XML parser.
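
The SAX-based usage can be illustrated with a short Python sketch (a hypothetical stand-in, not the XW implementation): a wrapper reads line-based legacy records and exposes them as a stream of SAX events, so a consumer sees the data exactly as if an XML parser were reading an XML document.

    # A hypothetical, minimal wrapper in the spirit of XW: line-based legacy records
    # are exposed through the standard SAX interface, so a consumer receives ordinary
    # startElement/characters/endElement events as if it were parsing an XML file.
    import io
    from xml.sax.saxutils import XMLGenerator

    def wrap(lines, record_tag, field_tags, out):
        gen = XMLGenerator(out, encoding='utf-8')
        gen.startDocument()
        gen.startElement('records', {})
        for line in lines:
            gen.startElement(record_tag, {})
            for tag, value in zip(field_tags, line.split(';')):
                gen.startElement(tag, {})
                gen.characters(value.strip())
                gen.endElement(tag)
            gen.endElement(record_tag)
        gen.endElement('records')
        gen.endDocument()

    legacy = ["Smith; John; 1972", "Virtanen; Maija; 1980"]     # invented sample data
    buf = io.StringIO()
    wrap(legacy, 'person', ['surname', 'firstname', 'birthyear'], buf)
    print(buf.getvalue())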


Report A/2002/1
T. Toroi, A. Eerola, J. Mykkänen
Testing business component systems

In this paper we present an effective and practical method for testing business component systems step by step. We utilize components of different granularity levels. The advantages of component-based systems are the possibility to master development and deployment complexity, to decrease time to market, and to support the scalability of software systems. Also, the great number of dependencies that occur in the object-oriented approach can be mastered better, because the majority of the dependencies between classes remain inside one component, where the number of classes is much smaller than in the total system or subsystem. The abstraction levels decrease the work needed in testing, because the testing work can be divided into sufficiently small concerns and the previously tested components can be considered as black boxes whose test results are available. Furthermore, errors can be detected easily, because only a few components are considered at a time.

In our method, components of different granularities are tested level by level. The idea of the method is that at each level white-box testing and black-box testing occur alternately. We define test cases based on component granularities at the distributed component, business component, and business component system levels. Test cases are derived from use cases or contracts. We use a dependency graph, which shows the dependencies between components of the same granularity. The dependency graph is used to ensure that the whole functionality of the component has been covered by test cases.
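
The coverage check based on the dependency graph can be sketched as follows (hypothetical component names and test cases, Python for illustration only): every dependency edge between same-granularity components must be exercised by at least one test case.

    # Toy check that the test cases cover every dependency between same-granularity
    # components (invented example data; not the method's actual notation).
    dependencies = {                     # component -> components it depends on
        'OrderUI':   ['OrderMgr'],
        'OrderMgr':  ['Billing', 'Inventory'],
        'Billing':   [],
        'Inventory': [],
    }
    test_cases = {                       # test case -> dependency edges it exercises
        'TC1': [('OrderUI', 'OrderMgr'), ('OrderMgr', 'Billing')],
        'TC2': [('OrderMgr', 'Inventory')],
    }

    all_edges = {(a, b) for a, deps in dependencies.items() for b in deps}
    covered   = {edge for edges in test_cases.values() for edge in edges}
    missing   = all_edges - covered
    print('full coverage' if not missing else f'uncovered dependencies: {missing}')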


Report A/2001/3
U. Timoshkina, Yu. Bogoyavlenskiy, M. Penttonen
Structured Documents Processing Using Lex and Yacc

This report studies the applicability of general-purpose programming language compiler tools and web browsers to the processing of structured documents. In particular, the scanner generator lex and the parser generator yacc, together with the compiler gcc, are used for transforming the document structure, while the browser Netscape is used for browsing and creating the contents of the structured documents. As an application, a work plan is developed as a structured document. It is assumed that the reader is familiar with the C programming language.


Report A/2001/2
M. Ek, H. Hakkarainen, P. Kilpeläinen, E. Kuikka, and T. Penttinen
Describing XML Wrappers for Information Integration

XML-based data formats are actively being developed for standard-based exchange of data between heterogeneous co-operating systems. To this end, we need transformation programs called wrappers, which are able to expose system-specific data using the chosen XML-based representation. Wrappers can be written using practically any general purpose programming language, but ad hoc solutions are tedious both to develop and to maintain. To alleviate the problem of implementing XML wrappers we introduce a simple yet powerful wrapper specification language called XW. Using XW one can describe the structure of serialized input data through a simple declarative specification, which acts as a template for the automatic generation of a corresponding structured XML representation. We introduce the language through example specifications of real-life wrappers. We also sketch its implementation principles, emphasizing the use of standard XML techniques.


Report A/2001/1
E. Kuikka, A. Eerola, J. Komulainen
Structuring the electronic patient record

The paper presents a methodology for the structure definition of the electronic patient record. A patient record is typically a document which is updated by many users, required to be presented in many different layouts, transferred from one place to another, and archived for a long time. For these reasons, it is a very good candidate to be processed as a structured document and coded using the SGML or XML document standards.

The design process starts from the most important issue: understanding the work done in the hospital at a level that makes it possible to describe the information flow of the patient record and the content, structure, and relationships of the data needed in the treatment of patients. This phase presents the information flow using data flow diagrams and models the content conceptually using object-oriented UML class diagrams. Next, the SGML/XML structure definition (Document Type Definition, DTD) is generated from the UML class diagrams using a set of correspondence rules.

The method gives a pragmatic way to design the structure of the electronic document. The results can be utilized in exploring possibilities to standardize the content of the electronic patient record.
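
A minimal Python sketch of such correspondence rules (hypothetical rules and class names, not the paper's rule set): each class becomes an element type, its attributes become #PCDATA subelements, and composition associations become nested child elements.

    # Hypothetical correspondence rules from a tiny class model to a DTD fragment:
    # class -> element type, attributes -> #PCDATA children, compositions -> children.
    classes = {
        'PatientRecord': {'attributes': ['patientId'], 'contains': ['Visit']},
        'Visit':         {'attributes': ['date', 'diagnosis'], 'contains': []},
    }

    def to_dtd(classes):
        lines = []
        for name, c in classes.items():
            parts = c['attributes'] + [child + '*' for child in c['contains']]
            lines.append(f"<!ELEMENT {name} ({', '.join(parts)})>")
            for a in c['attributes']:
                lines.append(f"<!ELEMENT {a} (#PCDATA)>")
        return '\n'.join(lines)

    print(to_dtd(classes))
    # e.g. <!ELEMENT PatientRecord (patientId, Visit*)> followed by the child declarations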


Report A/1999/7
E. Kuikka, P. Leinonen, M. Penttonen
An approach to document structure transformations

We characterize a class of structure transformations, called dense, hierarchic and local transformations, that can be efficiently implemented by a two-phase, semi-automatic procedure. In the first phase of our method, corresponding substructures are searched for by an interactive procedure. In the second phase, the replacement of substructures is automated by generating a tree transducer that implements it.
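
The second, automated phase can be pictured with a small Python sketch (hypothetical element names; a plain recursive rewrite rather than the generated tree transducer): the document tree is traversed top-down and each matching substructure is replaced locally while the transformed children are carried over.

    # Hypothetical local structure transformation on a document tree: nodes are
    # (label, children) pairs and a rule table maps source labels to target labels.
    rules = {'article': 'report', 'section': 'chapter', 'para': 'p'}    # invented rules

    def transform(node):
        label, children = node
        new_label = rules.get(label, label)          # local replacement of the label
        return (new_label, [transform(c) for c in children])           # recurse

    doc = ('article', [('section', [('para', []), ('para', [])]),
                       ('section', [('para', [])])])
    print(transform(doc))
    # ('report', [('chapter', [('p', []), ('p', [])]), ('chapter', [('p', [])])])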


Report A/1999/6
E. Kuikka, A. Eerola, A. Miettinen, J. Porrasmaa, M. Ek, J. Komulainen
An object-oriented method to create an SGML DTD of an electronic patient record

A patient record is typically a document which is updated by many users, required to be represented in many different layouts, transferred from one place to another, and archived for a long time. It is also an object of various types of queries. Thus, it is a very good candidate to be represented as a structured document and coded using the SGML document standard. An SGML-based system requires that the structure of the document is defined in advance as a Document Type Definition (DTD). If the structure is defined based on the paper form, the structure does not necessarily reflect what the doctors and nurses do in their everyday practice. The structure should reflect the use and reuse of the content elements. Object-oriented design methods produce this kind of information. The paper presents a method to generate the SGML DTD with the use of UML diagrams.


Report A/1999/5
J. Tervo, P. Kolmonen
A model for the control of multileaf collimator and related inverse planning

A new approach to inverse treatment planning in radiation therapy with the multileaf collimator (MLC) technique is presented. The application of MLC techniques requires an algorithm for computing the positions or velocities of the leaves as a function of time such that the prescribed dose in the patient space is obtained. First, the intensity distribution in the treatment space is determined by applying some inverse treatment technique. Then an MLC control algorithm which generates the determined intensity distribution is required. In the present paper a mathematical model is given which controls the MLC such that the intensity distribution is obtained under selected dose constraints. In addition, an MLC control algorithm not requiring an intensity distribution as an intermediate step is presented. This algorithm uses the MLC leaf positions and delivery times directly in inverse treatment planning. The method can easily be implemented for multiple static treatments ("step-and-shoot"), but a modification to dynamic MLC techniques is also straightforward.


Report A/1999/4
J. Tervo, P. Kolmonen, T. Lyyra-Laitinen, J.D. Pinter, T. Lahtinen
An optimization-based approach to the multiple static delivery technique in radiation therapy

This paper considers intensity-modulated radiotherapy (inverse) treatment planning. An approach to determine the trajectories of the leaves of the multileaf collimator (MLC) in order to produce the prescribed intensity distribution is developed. The paper concentrates on the multiple static delivery technique. A mathematical model for calculating the intensity distribution with the help of the locations of the leafheads of subsequent subfields is constructed. Furthermore, an optimization model in which the decision variables are the locations of the leafheads is developed. The relevant constraints are considered as well. The optimization problem is a large-dimensional constrained nonlinear global extremum problem. It is solved by the LGO (Lipschitz (Continuous) Global Optimizer) program system. Comparisons with another optimization method (Hooke-Jeeves iteration) are included. Numerical experiments are presented to confirm the functionality of the method.
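
For intuition, the following Python sketch (a toy decomposition with invented numbers, not the paper's optimization model or the LGO solver) splits a one-dimensional integer intensity profile into unit-weight subfields, each deliverable as one contiguous leaf-pair opening; summing the subfields reproduces the prescribed profile.

    # Toy "step-and-shoot" decomposition of a 1-D integer intensity profile into
    # unit-weight subfields, each deliverable by one left/right leaf-pair opening
    # (a contiguous open interval).  Illustration only.

    def decompose(profile):
        f = list(profile)
        segments = []                            # (left_leaf, right_leaf) positions
        while any(f):
            l = next(i for i, v in enumerate(f) if v > 0)
            r = l
            while r < len(f) and f[r] > 0:       # maximal contiguous positive run
                r += 1
            for i in range(l, r):                # deliver one unit over [l, r)
                f[i] -= 1
            segments.append((l, r))
        return segments

    profile = [0, 2, 3, 1, 0, 2]                 # invented prescribed fluence values
    segs = decompose(profile)
    check = [sum(1 for l, r in segs if l <= i < r) for i in range(len(profile))]
    print(segs, check == profile)                # the subfields add up to the profile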


Report A/1999/3
Anssi Kautonen, Ville Leppänen, Martti Penttonen
Generalized thinning protocols for routing h-relations in complete networks

We present a routing algorithm, called the generalized thinning algorithm, for complete networks under the OCPC assumption. This algorithm generalizes earlier versions of thinning, which were proved to be competitive in comparison with other algorithms found in the literature.
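
The OCPC collision rule and the effect of thinning can be illustrated with a small Python simulation (hypothetical parameters, and uniformly random targets as an approximation of an h-relation; not the generalized protocol itself): in each round only a random fraction of the pending packets is transmitted, and a packet is delivered only if it is the sole packet aimed at its target in that round.

    # Toy simulation of routing under the OCPC assumption on a complete network:
    # a transmission succeeds only if no other packet targets the same processor in
    # the same round.  Thinning transmits each pending packet with probability p.
    import random
    from collections import Counter

    def route(n, h, p, rng):
        # pending[i] = target processors of the packets still held by sender i
        pending = [[rng.randrange(n) for _ in range(h)] for _ in range(n)]
        rounds = 0
        while any(pending):
            rounds += 1
            attempts = [(s, targets[-1]) for s, targets in enumerate(pending)
                        if targets and rng.random() < p]
            receivers = Counter(t for _, t in attempts)
            for s, t in attempts:
                if receivers[t] == 1:            # sole packet aimed at t this round
                    pending[s].pop()             # delivered; drop from sender's queue
        return rounds

    rng = random.Random(1)
    for p in (1.0, 0.5):                         # no thinning vs. 50 % thinning
        print('p =', p, '->', route(64, 8, p, rng), 'rounds')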


Report A/1999/2
Anne Eerola
Disciplined Approach to the Maintenance of the Class Hierarchy

Encapsulation and the definition of objects in their respective classes facilitate the modification of object-oriented systems, but the class hierarchy may also be a source of problems. First, before changing the definition of a class, the analyst must verify that the change is coherent with regard to the subclasses of the class, recursively. Second, the reusability of properties and the uniformity and functionality of the system can be increased by defining properties as high as possible in the class hierarchy. Consequently, the maintainer has to navigate up and down the class hierarchy while investigating and implementing changes. In this work it is shown that maintenance is a disciplined process that can be well supported with object-oriented algorithms.

Keywords: object-oriented, maintenance, object, class, class hierarchy, inheritance, reuse
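
The recursive check mentioned above can be sketched in Python as follows (hypothetical class data, not the paper's algorithms): before a property of a class is changed, all transitive subclasses are collected so that each of them can be inspected for conflicts with the change.

    # Hypothetical sketch: collect all transitive subclasses of a class so that a
    # planned change can be checked against every one of them.
    subclasses = {                       # direct subclass relation (invented example)
        'Component': ['Widget', 'Container'],
        'Widget':    ['Button', 'Label'],
        'Container': ['Panel'],
        'Button': [], 'Label': [], 'Panel': [],
    }

    def affected_by_change(cls):
        result, stack = [], [cls]
        while stack:
            c = stack.pop()
            for sub in subclasses.get(c, []):
                result.append(sub)
                stack.append(sub)
        return result

    print(affected_by_change('Component'))
    # ['Widget', 'Container', 'Panel', 'Button', 'Label']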


Report A/1999/1
E. Kuikka, A. Eerola, J. Porrasmaa, A. Miettinen, J. Komulainen
Design of the SGML-based electronic patient record system with the use of object-oriented analysis methods

A patient record is typically a document updated by many users, required to be represented in many different layouts, and transferred from place to place. It is also an object of various types of queries. In contrast to the fixed-length data typical of administrative data, clinical data is usually represented as free text. Thus, the patient record is a good candidate to be represented as a structured document and coded using the SGML document standard.

The use of SGML requires that the structure of the document is defined in advance by a Document Type Definition (DTD) and that the document follows it. If the structure is defined based on the paper form, the structure does not necessarily reflect what the doctors and nurses actually do in their everyday practice. This paper presents a method which derives an SGML DTD by starting from a description of the usage of the patient record in medical care and nursing.


Report Series B
Report B/2005/2
Timo Laitinen
Asiantuntijajärjestelmän soveltuvuus metsänuudistamisen päätöksenteon tukemiseen

Forest regeneration is a complex decision-making process. Decision making can be facilitated and improved by developing an expert system for the problem. Forest regeneration fulfils the requirements set for expert system application domains, the most important of which are that the problem can be clearly delimited, that it is complex and heuristic in nature, and that the required knowledge is available. One of the most central benefits of an expert system for forest regeneration is the increased manageability of decision making.


Report B/2005/1
Keijo M.J. Haataja
Detailed descriptions of new proof-of-concept Bluetooth security analysis tools and new security attacks

This report describes the details of two new proof-of-concept Bluetooth security analysis tools and two new attacks against Bluetooth security. The On-Line PIN Cracking script is a security analysis tool for on-line Bluetooth device PIN cracking. The Brute-Force BD_ADDR Scanning script is a security analysis tool for brute-force discovery of the addresses of Bluetooth devices that want to be private. Scripts for both of our security analysis tools exist and can be demonstrated to Bluetooth device manufacturers or the press if required, but they will not be released publicly because, due to their efficiency, they can be very dangerous. Our new attacks, BTKeylogging and BTVoiceBugging, extend the On-Line PIN Cracking attack.


Report B/2004/3
S. Räsänen
Verkko-opetuksen tietotekniikkaa - Simulaatio opetuksessa

Computers and computer networks are in general use in teaching and in support of teaching. Typical applications include e-mail, web pages, e-learning platforms, and audio and video conferencing. The services offered by information technology are used both in educational institutions and in personnel training in companies. The growing number of computers and improving user skills have contributed to this spread, and improved telecommunication connections have furthered it as well.

Simulations can be realized in many ways. One way is to simulate a situation in which the actors are people. Alternatively, a device can be built with which the simulation is carried out. A third way to realize a simulation is to use computers and computer networks. Computer programs can be written so that the student acts in a modelled environment and performs exercises within the simulation. On a larger scale, a simulation is implemented in a networked fashion, where many people carry out the same exercise and each person has a role of their own; the communication is handled over computer networks. In computer-based simulations an instructor can also take part, for example by giving advice and hints, altering the situation being practised, and evaluating performance.

Simulation makes it possible to practise situations that cannot be carried out safely in the real world. An aircraft accident is not staged for the sake of practice; rather, in an accident situation every actor must be able to perform the required actions correctly. Accident situations, the handling of hazardous substances, the use of equipment, the maintenance of devices, process control, and the study of tactics, among others, are very good subjects for simulation. After a simulation exercise, students and instructors can evaluate the performance and, through the exercise, learn different operating models and their effects in real situations.

This report describes what simulation is and how it can be used in teaching. In addition, the report describes networked simulation and the related HLA standard, as well as the production of a PC simulation.

Keywords: simulation, simulation in teaching, interactive simulation.


Report B/2004/2
M. Marttila-Kontio, R. Honkanen
Toimintatutkimus tietojenkäsittelytieteen opetuksen kehittämisestä Kuopion yliopistossa

The use of computers has increased strongly over the last few decades. At the same time, both information technology and software have become more complex. These factors have increased the need for teaching computer use and computer science. Computer science teaching has traditionally been based mainly on lectures, exercises, and project work done in small groups. Traditional lecture-type teaching easily conveys unambiguous basic knowledge to a large audience, but new kinds of teaching and assessment methods may be needed to motivate students and to encourage critical thinking.

The aim of this study was to examine, by means of action research, the use of new kinds of teaching and assessment methods in computer science teaching. The teaching methods were evaluated on the Discrete Mathematics, Optical Communication, and Computer Systems courses. In the first part of this research report we briefly review, based on the literature, the main principles of teaching and learning as well as the starting points of the study at the Department of Computer Science of the University of Kuopio. In the latter part we present general plans for developing different types of courses and report the feedback received on the methods used.

The teaching methods studied were the use of a learning diary and visualizers in support of lecture teaching, group formation in exercise sessions, and the advance publication of extensive essay-type final examination questions. Based on our studies, the use of a lecture diary improves learning outcomes, provided that sufficiently clear instructions are given for writing it. The use of visualizers to illustrate various phenomena was received positively. Dividing the students into groups and doing the exercises as group work at the beginning of the exercise session was also experienced as positive. The advance publication of essay questions structured sensibly with respect to the course content appeared to motivate students to study when the content of the course was broad and fragmented.

Keywords: action research, computer science teaching.


Report B/2004/1
K. Hyppönen
Providing Karelian Language Support for an Educational Linux System

In this paper we describe a Linux distribution that is meant to provide a suitable educational and working environment for users who need support for the Karelian language in the system. The distribution is CD-based and can be booted from the compact disc or installed on a computer's hard drive.

We explain the considerations behind the choice of the Knoppix platform for our distribution and list the main contents of the CD. We then proceed to the main topic of the paper, namely a description of the steps needed to introduce Karelian language support into Knoppix and into Linux in general. The final part of the report discusses possible ways of keeping the distribution up to date.

Keywords: Linux, Knoppix, Karelian language, computer science education.