
V3D2 Digital Library Initiative DFG

Educational Multimedia Library Project


Motivation

In the context of globalized education markets, exporting courses is becoming increasingly important for universities. The participants in such courses are typically distributed across geographically diverse locations. An important prerequisite for such a distance education scenario is location-independent access to educational documents for all participants.

Multimedia elements (videos, animations, simulations, etc.) can significantly increase the quality of educational material. However, only a small number of educational documents are actually enriched with multimedia elements. One reason for this situation is the high effort usually required to produce multimedia material; another is that current Internet and Web technology focuses on the distribution of discrete media.

To overcome these limitations, an Internet-based multimedia teachware server is being developed in the Educational Multimedia Library Project. The server is to provide multimedia documents for use within different (synchronous and asynchronous) education scenarios. The main focus of the project is the transmission of continuous and interactive media and the storage of these media types on a server.

Results of Project Phase I

Presentation of Multimedia Documents on a Teachware Server

One of the main obstacles when publishing lecture material on a teachware server is that the material is usually available only in an authoring-tool-specific format. We call the format produced by the authoring tool the primary format. A primary format is usually suited neither for the presentation of lecture material in an asynchronous teleteaching scenario (e.g. on the Web), nor for the presentation in a synchronous scenario (e.g. during a lecture using a shared whiteboard). Since the two scenarios have different requirements that must be taken into account to allow a flexible use of documents on a teachware server, a conversion from one primary format into multiple presentation formats is required. Besides the conversion of document formats, the asynchronous scenario requires the application of a clear instructional design that guides a learner through the presented material.

We have implemented a conversion and structuring process that fulfills the above requirements (see Fig. 1 Conversion and structuring process). In the first step, the author edits a document using his or her favorite authoring tool. In the second step, the presentation formats are generated. In the third step of the authoring process, the HTML document resulting from step two is post-processed for its presentation on the Web: the HTML document is split and cross-linked according to an instructional design. In this step, further multimedia material can be integrated into the documents. For example, we have recorded all lectures held during the past semesters. These recorded audio and video clips can be linked to the generated documents accordingly. As a result, a multimedia Computer-Based Training (CBT) unit is created. [11]
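The splitting and cross-linking of the third step can be sketched as follows. This is a minimal illustration assuming a simple linear instructional design and hypothetical page names; the actual process is richer and also integrates the recorded audio, video and exercises.

```python
# Minimal sketch (not the actual production code): split a generated
# HTML document at heading boundaries and cross-link the resulting
# pages with previous/next navigation in a linear instructional design.
import re

def split_and_link(html, delimiter=r"<h2>"):
    """Split an HTML body at <h2> headings into cross-linked CBT pages."""
    parts = [p for p in re.split(delimiter, html) if p.strip()]
    pages = []
    for i, body in enumerate(parts):
        nav = []
        if i > 0:
            nav.append(f'<a href="page{i - 1}.html">previous</a>')
        if i < len(parts) - 1:
            nav.append(f'<a href="page{i + 1}.html">next</a>')
        # each page keeps its heading and gains a navigation footer
        pages.append((f"page{i}.html", "<h2>" + body + "\n" + " | ".join(nav)))
    return pages

pages = split_and_link("<h2>Intro</h2>...<h2>OSI Model</h2>...<h2>TCP</h2>...")
```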

Using this process we have created multimedia CBTs for complete courses on "Computer Networks" [1] and "Multimedia Technology" [2] (see Fig. 2 Screenshot of the CBT), and for several chapters of a course on "Java". Our CBTs contain the slides of a lecture linked together with the corresponding audio and video recordings, with Java animations and with online exercises. Our production process is automated to a large extent, so that the generation of a CBT during a semester requires only one student assistant. The access statistics of our server and the very positive feedback from students show that our CBTs are very well accepted.

Fig. 1: Conversion and structuring process

Fig. 2: Screenshot of the CBT

Recording and Playback Server for Shared Whiteboard Streams

The Multicast Backbone (MBone) provides the infrastructure for efficient multipoint data delivery in the Internet. In the synchronous teleteaching scenario, this infrastructure is exploited to transmit a lecture to multiple lecture rooms using audio and video conferencing facilities together with shared whiteboard applications. The data streams transmitted between the distributed lecture rooms (audio, video and shared whiteboard) can be captured, stored and played back on demand, enabling students to review the lectures.

Many tools exist in the Internet that accomplish the task of recording audio and video streams. However, little work has been done on the recording of media streams other than audio and video. The main problem in implementing such a recorder is that the characteristics of the media streams produced by shared whiteboard applications differ from those of audio and video streams: decoding a shared whiteboard media stream in principle requires the receivers to hold the full state of the medium. It is therefore not possible to start decoding a shared whiteboard stream at an arbitrary position in the stream. For example, a draw-line event cannot be decoded without the page on which the line is to be drawn. In particular, random access to a recorded stream requires that the recorder provide the current media state to the receivers.

We have developed a novel paradigm for recording and playing back shared whiteboard data streams that provides an efficient and flexible solution to this problem. Our paradigm is based on a model for shared whiteboard media streams which assumes an event-based structure of these streams. As a main feature, our recording scheme enables efficient random access to recorded streams. In contrast to other approaches, it does not require the restoration of the full shared whiteboard state when accessing a recorded media stream. Instead, only those parts of the state are transmitted that are actually required during playback. As the internal state of a shared whiteboard may comprise a very large amount of data (e.g. a number of PostScript slides, GIF images, etc.), our algorithm significantly reduces the amount of data transmitted over the network during each random access operation. We have implemented our proposed recording scheme in a recorder for the digital lecture board (dlb) [5], which extends the MBone VCR on Demand (MVoD) system [13].
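The idea of transmitting only the required parts of the state can be illustrated with a toy model. The event format and all names below are hypothetical; a real recorder operates on the actual whiteboard protocol messages rather than simple tuples.

```python
# Toy model (illustrative only) of partial-state random access:
# on a jump to time t, only the state of the page visible at t is
# transmitted, not the full whiteboard state (all pages ever shown).
class WhiteboardRecorder:
    def __init__(self):
        self.events = []  # (timestamp, page, action), in recording order

    def record(self, ts, page, action):
        self.events.append((ts, page, action))

    def random_access(self, t):
        """Return (partial_state, remaining_events) for playback from t."""
        state = {}            # per-page list of actions applied so far
        current_page = None   # page visible at time t
        i = 0
        while i < len(self.events) and self.events[i][0] <= t:
            ts, page, action = self.events[i]
            state.setdefault(page, []).append(action)
            current_page = page
            i += 1
        # transmit only the visible page's state plus the future events
        partial = {current_page: state[current_page]} if current_page else {}
        return partial, self.events[i:]
```

A receiver initialized this way can render the visible page immediately; state for other pages would only be fetched if playback actually reaches them.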

A Generic Transmission Protocol for Interactive Media

Even though our algorithms for the recording and playback of shared whiteboard media streams are generic, the implementation of a shared whiteboard recorder must consider the application-layer protocol of an actual shared whiteboard. Currently, no common shared whiteboard protocol exists that would enable a more generic recorder. Generally speaking, no common application-layer protocol framework exists for the class of interactive media with real-time characteristics, i.e. media involving user interaction. Examples of interactive media are: shared whiteboard applications, multiuser VRML models and distributed Java animations. Existing approaches to define application-layer protocols for the distribution of some interactive media are mostly proprietary. This prevents interoperability and hinders any sharing of common tools like a generic recording service, while requiring re-implementation of similar functionality for each protocol.

In contrast, the development of application-layer protocols for the real-time distribution of audio and video has been a focus of research for several years. Most notable is the success of the Real-Time Transport Protocol (RTP) [14]. RTP is a protocol that must be tailored to the specific needs of different media and media classes. It is therefore accompanied by documents describing the specific encoding of different media types within the RTP framework.

In order to establish a common foundation for the class of interactive media, we have developed the RTP/I protocol [9] [10], which is a general, RTP-based, application-layer protocol for the distribution of this media class. RTP/I is built around a very generic model of interactive media. This model assumes that an interactive medium is well defined by its current state at any point in time. The state of an interactive medium can change for two reasons, either by passage of time or by events. A typical example of an event is the interaction of a user with the medium. An example of a state change caused by the passage of time might be the animation of an object moving across the screen. For a certain medium the RTP/I protocol can be instantiated by providing medium-specific information, reusing the infrastructure set up by RTP/I. RTP/I itself captures the common aspects of the interactive media class, enabling the reuse of existing code and the development of generic services like recording or late join.
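The media model underlying RTP/I can be sketched as follows, using the moving-object animation mentioned above as the example. All class and field names here are illustrative assumptions, not part of the RTP/I specification.

```python
# Sketch of the RTP/I media model (names are illustrative): the medium
# is fully defined by its state at any point in time, and the state
# changes either through the passage of time or through events.
class InteractiveMedium:
    def __init__(self, x=0.0, velocity=1.0):
        self.x = x                 # position of an animated object
        self.velocity = velocity   # units per second
        self.clock = 0.0           # media time

    def advance(self, dt):
        """State change caused by the passage of time (animation)."""
        self.clock += dt
        self.x += self.velocity * dt

    def apply_event(self, event):
        """State change caused by an event, e.g. a user interaction."""
        if event["type"] == "set-velocity":
            self.velocity = event["value"]

    def state(self):
        """Snapshot with which a late-joining receiver can be initialized."""
        return {"x": self.x, "velocity": self.velocity, "clock": self.clock}
```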

Objectives of Project Phase II

In project phase II, the prototype implemented in phase I is to be improved and extended. The main emphasis of project phase II is on the transmission and storage of continuous and interactive media and the integration of these media types with discrete media. In detail, the following objectives are to be achieved in phase II:

  • For the class of interactive media there are currently no standardized application- and transport-layer protocols. This prevents the interoperability of interactive media applications and the sharing of common tools, and thus prevents the use of interactive media documents across different systems.
    The development of the RTP/I protocol for interactive media begun in project phase I is to be continued, and a full specification in RFC style is to be created. The main idea of RTP/I is that it should capture the common aspects of the class of interactive media while remaining open towards the requirements of specific media types. An important aspect to be investigated is that interactive media types will have different requirements concerning the reliability of transfer over the network. Thus, RTP/I should not implement reliability itself but should be able to co-exist with reliable multicast protocols or reliable multicast protocol frameworks. Furthermore, RTP/I is to provide mechanisms on which an application may implement a policy for the consistency of the distributed media states. This policy may again depend on the specific requirements of an interactive media application.
  • Generic services can be developed for interactive media types. Examples are the generic recording service, a service that enables the change of the visualization or playback speed, a generic late join service, etc. To enable the use of such a service during a transmission, a user must be informed about its presence and about the parameters supported by this service. For this reason, a protocol is required that allows the announcement of generic services.
  • A large number of generic recording services exist for the class of audio and video. Video-on-demand servers are capable of recording, storing and playing back video clips irrespective of the specific video encoding. In contrast, no such generic recording and playback service exists for the interactive media class, which is particularly relevant to educational documents. Common practice is either to put documents of this media class on a web server for complete download, with no support for the streaming of relevant parts, or to develop a proprietary server for a specific media type.
  • To overcome this limitation, a generic media-on-demand server is to be developed that can record and play back audio, video and interactive media streams. To achieve this goal, an existing video-on-demand server is to be extended with a component for the recording and playback of interactive media streams. The main technical challenge is to provide generic random access to the stored media streams. As described for shared whiteboard streams, interactive media streams in general require the initialization of receivers with the correct media state before playback may start. Thus, algorithms are to be developed that operate on the common RTP/I protocol elements and provide a media-independent initialization of receivers. The principles of such algorithms are described in [12] and [6].
  • Currently, a tight integration of discrete media types with continuous and interactive media types within documents on the Web is hardly possible. Single media elements can be integrated into a Web document using specific plug-ins, which operate as autonomous units within the web browser. This prevents user operations on the overall document. To overcome this limitation, a Java applet is to be developed that enables the control of multiple media streams. The separation of navigational control and media decoding enables user interaction with an integrated multimedia document and facilitates the integration of new media types.
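The media-independent receiver initialization named in the objectives above can be sketched as follows. The PDU layout used here is a simplifying assumption for illustration, not the actual RTP/I packet format.

```python
# Sketch of media-independent receiver initialization (assumed PDU
# layout, not the real RTP/I packet format): each recorded PDU is a
# (timestamp, kind, payload) tuple, where kind is "state" for a full
# state snapshot or "event" for a state-changing event.
def initialize_receiver(pdus, t):
    """To start playback at time t, send the latest state snapshot
    recorded at or before t, plus all events since that snapshot."""
    snapshot, snapshot_ts = None, float("-inf")
    for ts, kind, payload in pdus:
        if kind == "state" and ts <= t:
            snapshot, snapshot_ts = payload, ts
    catch_up = [p for p in pdus
                if p[1] == "event" and snapshot_ts < p[0] <= t]
    return snapshot, catch_up
```

Because the algorithm only inspects timestamps and PDU kinds, it needs no knowledge of the medium-specific payload encoding, which is what makes a generic recording service possible.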

Acknowledgments

This work is supported by the Deutsche Forschungsgemeinschaft (DFG) within the V3D2 Digital Library Initiative.

Project Publications

[1] W. Effelsberg. CBT zur Vorlesung Rechnernetze. URL: http://www-mm.informatik.uni-mannheim.de/veranstaltungen/ws199899/rechnernetze/, 1999.
[2] W. Effelsberg. CBT zur Vorlesung Multimedia Technik. URL: http://www-mm.informatik.uni-mannheim.de/veranstaltungen/ss1999/multimedia/, 1999.
[3] V. Hilt. The Recording of Interactive Media Streams Using a General Framework. Technical Report TR 14-98, University of Mannheim, Germany, 1998.
[4] V. Hilt. The Educational Multimedia Library Project. URL: http://www.informatik.uni-mannheim.de/informatik/pi4/projects/emulib/, 2000.
[5] V. Hilt, W. Geyer, W. Effelsberg. A New Paradigm for the Recording of Shared Whiteboard Streams, Proc. SPIE Multimedia Computing and Networking (MMCN'00), San Jose, California, USA, 2000.
[6] V. Hilt, M. Mauve, C. Kuhmünch, W. Effelsberg. A Generic Scheme for the Recording of Interactive Media Streams. Proc. International Workshop on Interactive Distributed Multimedia Systems and Telecommunication Services 1999 (IDMS'99), Toulouse, France, M. Diaz et al. (Eds.), LNCS 1718, Springer Verlag, Berlin, Germany, pp. 291-304, 1999.
[7] V. Hilt, C. Kuhmünch. New Tools for Synchronous and Asynchronous Teaching and Learning in the Internet. Proc. World Conference on Educational Multimedia and Hypermedia & Educational Telecommunications 1999 (ED-MEDIA & ED-TELECOM'99), Seattle, USA, AACE, 1999. Available on CD-ROM, contact: http://www.aace.org/pubs/.
[8] S. Lucks, R. Weis, V. Hilt. Fast Encryption for Set-Top Technologies. Proc. SPIE MCN'99, Multimedia Computing and Networking, San Jose, California, USA, Vol. 3654, pp. 84-94, 1999.
[9] M. Mauve, V. Hilt, C. Kuhmünch, W. Effelsberg. A General Framework and Communication Protocol for the Transmission of Interactive Media with Real-Time Characteristics. Proc. IEEE International Conference on Multimedia Computing and Systems (ICMCS'99), Florence, Italy, IEEE, 1999.
[10] M. Mauve, V. Hilt, C. Kuhmünch, W. Effelsberg. A General Framework and Communication Protocol for the Real-Time Transmission of Interactive Media. Technical Report TR 16-98, University of Mannheim, Germany, 1998.
[11] C. Schremmer, V. Hilt. A Systematic Approach to the Automatic Conversion of a "Live" Lecture into a Multimedia CBT Course. Proc. 2nd international Conference on New Learning Technologies (NLT'99), Berne, Switzerland, 1999.

Other References

[12] W. Geyer, W. Effelsberg. The Digital Lecture Board - A Teaching and Learning Tool for Remote Instruction in Higher Education. Proc. World Conference on Educational Multimedia and Hypermedia & Educational Telecommunications 1998 (ED-MEDIA & ED-TELECOM'98), Freiburg, Germany, AACE, 1998. Available on CD-ROM, contact: http://www.aace.org/pubs/.
[13] W. Holfelder. Interactive Remote Recording and Playback of Multicast Videoconferences. Proc. International Workshop on Interactive Distributed Multimedia Systems and Telecommunication Services 1997 (IDMS'97), Darmstadt, Germany, R. Steinmetz, L. Wolf (Eds.), LNCS 1309, Springer Verlag, Berlin, pp. 450-463, September 1997.
[14] H. Schulzrinne, S. Casner, R. Frederick, V. Jacobson. RTP: A Transport Protocol for Real-Time Applications. Internet Draft, Audio/Video Transport Working Group, IETF, draft-ietf-avt-rtp-new-03.txt, 1999.

Volker Hilt
Last modified: Fri Jan 14 19:09:13 MET 2000