Intelligent Video Conferencing

Status Report

Conclusion


Overall, the Intelligent Video Conferencing project is progressing well. The video conferencing portion of the project was held up slightly while waiting for documentation from Connectix on how to control the QuickCam, though the "extra" time was spent doing further research into existing implementations of similar projects. After the cameras arrived, some time was also spent studying the operation of the camera using the functional software that accompanies the QuickCam. (This software is designed for snapshots or strobe shots, not motion video.) At this point, flowcharts detailing the subsystem operations are complete and the resulting pseudocode is being written. Implementation as a C program will soon follow.
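To give a sense of the structure the flowcharts describe, the sketch below shows one possible shape for the capture loop in C. The routine names, frame dimensions, and frame count are placeholders for illustration only; they are not finalized project routines or actual QuickCam calls.

#include <stdio.h>
#include <stdlib.h>

#define FRAME_WIDTH     320
#define FRAME_HEIGHT    240
#define FRAMES_TO_GRAB   10        /* bounded for this sketch only */

typedef struct {
    unsigned char pixels[FRAME_WIDTH * FRAME_HEIGHT];  /* grayscale frame */
} Frame;

/* Stub routines so the sketch compiles; the real versions would talk to
   the camera hardware and hand frames to the networking code. */
static int camera_init(void)           { return 0; }
static int camera_grab_frame(Frame *f) { (void)f; return 0; }
static int send_frame(const Frame *f)  { (void)f; return 0; }

int main(void)
{
    Frame frame;
    int i;

    if (camera_init() != 0) {
        fprintf(stderr, "camera initialization failed\n");
        return EXIT_FAILURE;
    }

    /* Capture loop: grab a frame, pass it on, repeat.  The real program
       would run continuously rather than for a fixed count. */
    for (i = 0; i < FRAMES_TO_GRAB; i++) {
        if (camera_grab_frame(&frame) != 0)
            break;
        if (send_frame(&frame) != 0)
            break;
    }
    return EXIT_SUCCESS;
}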

The team continues to explore the Java language and refine the interface as new needs become apparent. While currently only a conceptual image, the mock-up of the interface provides a way to focus on the various aspects of the project in a format other than text alone. When all other components are complete, the interface specifications will be fully developed and coding can begin.

After the video system is complete, the audio subsystem will be the next to be implemented. Because no one on the design team has previously programmed for a SoundBlaster-compatible system, research is already underway to understand the needs in this area. Following this will be the networking code, where the key will be to optimize packet composition and compression so that nearly full-motion video can be transmitted along with an audio stream without unacceptable delays caused by the constraints of network bandwidth.
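As one illustration of what packet composition might involve, the sketch below lays out a simple header carrying the kinds of fields the networking code is likely to need. The field names, sizes, and payload limit are assumptions made for illustration, not a finalized project format.

#include <stdio.h>
#include <string.h>

#define MAX_PAYLOAD 1400   /* keep packets under a typical Ethernet MTU */

enum payload_type { PAYLOAD_VIDEO = 0, PAYLOAD_AUDIO = 1 };

typedef struct {
    unsigned short sequence;      /* detect lost or reordered packets    */
    unsigned char  type;          /* PAYLOAD_VIDEO or PAYLOAD_AUDIO      */
    unsigned char  compressed;    /* nonzero if the payload is compressed */
    unsigned short length;        /* number of payload bytes that follow */
} PacketHeader;

typedef struct {
    PacketHeader  header;
    unsigned char payload[MAX_PAYLOAD];
} Packet;

/* Fill in a packet; a real implementation would also compress the data
   before copying it into the payload. */
static size_t build_packet(Packet *p, unsigned short seq, unsigned char type,
                           const unsigned char *data, unsigned short len)
{
    if (len > MAX_PAYLOAD)
        len = MAX_PAYLOAD;
    p->header.sequence   = seq;
    p->header.type       = type;
    p->header.compressed = 0;
    p->header.length     = len;
    memcpy(p->payload, data, len);
    return sizeof(PacketHeader) + len;
}

int main(void)
{
    Packet p;
    unsigned char sample[4] = { 1, 2, 3, 4 };
    size_t bytes = build_packet(&p, 0, PAYLOAD_AUDIO, sample, sizeof sample);

    printf("packet of %lu bytes ready to send\n", (unsigned long)bytes);
    return 0;
}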

In the Image Following group, testing has been done with the transducers that will be used, and the design of the needed interface circuitry and control software is progressing based on these results. At the same time, the motors have been acquired, and the interface circuitry and control software for them are also progressing. The design of the assembly is underway as well; soon after its completion it should be possible to integrate the two parts of the Image Following subsystem so that testing and design improvements can begin.
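The control software for this subsystem will likely take the shape of a loop that reads the transducers, compares the reading to a centered reference, and steps the motors to correct any offset. The sketch below shows that structure in C; the routine names, reference value, and dead band are placeholders, since the actual interface circuitry and MC68HC11 software are still being designed.

#include <stdlib.h>

#define CENTER      512   /* transducer reading when the subject is centered */
#define DEAD_BAND    20   /* ignore small offsets to avoid motor hunting     */

/* Stubbed hardware access so the sketch compiles; real versions would
   read the transducer interface and pulse the motor driver. */
static int  read_transducer(void)     { return CENTER; }
static void step_motor(int direction) { (void)direction; }

int main(void)
{
    int i;

    /* A bounded loop for illustration; the real controller runs continuously. */
    for (i = 0; i < 100; i++) {
        int reading = read_transducer();
        int error   = reading - CENTER;

        if (error > DEAD_BAND)
            step_motor(+1);   /* subject drifted one way: pan toward it */
        else if (error < -DEAD_BAND)
            step_motor(-1);   /* drifted the other way: pan back        */
        /* within the dead band: hold position */
    }
    return EXIT_SUCCESS;
}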

At this time, all parts needed for development to proceed have been obtained. The prototype system will allow two stations to engage in a conference, though only one will have the image-following capability. Because the system is being marketed as a third-party accessory for those who already own a QuickCam and sound card, some of the parts required for a commercial version will differ from those needed for development. Examples are the QuickCams, which would not be needed for the commercial version, and the Motorola EVB, which is designed for testing systems and would be replaced by a scaled-down board containing the MC68HC11 microcontroller. However, it is also recommended that a bundled product be offered for those who do not already own a QuickCam and sound card, so that they could purchase one box and get everything they need.
