<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://wiki.multimedia.cx/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Reimar</id>
	<title>MultimediaWiki - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://wiki.multimedia.cx/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Reimar"/>
	<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php/Special:Contributions/Reimar"/>
	<updated>2026-04-28T08:51:32Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.39.5</generator>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=FFmpeg_Summer_of_Code_2014&amp;diff=14954</id>
		<title>FFmpeg Summer of Code 2014</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=FFmpeg_Summer_of_Code_2014&amp;diff=14954"/>
		<updated>2014-02-05T19:04:25Z</updated>

		<summary type="html">&lt;p&gt;Reimar: /* Hardware Acceleration API Software/Tracing Implementation */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page is on wiki.multimedia.cx because trac.ffmpeg.org is down for maintenance. It might be moved back later.&lt;br /&gt;
&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
FFmpeg is the universal multimedia toolkit: a complete, cross-platform solution to record, convert, filter and stream audio and video. It includes libavcodec - the leading audio/video codec library.&lt;br /&gt;
&lt;br /&gt;
[https://developers.google.com/open-source/soc/ Google Summer of Code (GSoC)] is a program that offers students stipends to write code for open source projects. Through the guidance of mentors, students gain valuable experience interacting with and coding for open source projects like FFmpeg. Additionally, the project and its users benefit from code created by students, who often continue contributing as developers. FFmpeg participated in several past editions ([[FFmpeg Summer Of Code 2006|2006]], [[FFmpeg Summer Of Code 2007|2007]], [[FFmpeg Summer Of Code 2008|2008]], [[FFmpeg Summer Of Code 2009|2009]], [[FFmpeg Summer Of Code 2010|2010]], and [[FFmpeg / Libav Summer Of Code 2011|2011]]), and we are looking forward to being involved this year.&lt;br /&gt;
&lt;br /&gt;
This is our ideas page for [http://www.google-melange.com/gsoc/homepage/google/gsoc2014 Google Summer of Code 2014]. See the GSoC Timeline for important dates.&lt;br /&gt;
&lt;br /&gt;
== Information for Students ==&lt;br /&gt;
&lt;br /&gt;
=== Getting Started ===&lt;br /&gt;
&lt;br /&gt;
0. '''Get to know FFmpeg.''' If you are a student interested in contributing to an FFmpeg GSoC project, it is recommended to start by subscribing to the [http://ffmpeg.org/mailman/listinfo/ffmpeg-devel ffmpeg-devel] mailing list, visiting our IRC channels (''#ffmpeg-devel'' and ''#ffmpeg''), and exploring the codebase and the development workflow. Feel free to [[#Contacting_FFmpeg|contact us]] if you have any questions.&lt;br /&gt;
&lt;br /&gt;
1. '''Find a project.''' Listed on this page are mentored and unmentored projects. Mentored projects are well-defined and mentor(s) have already volunteered. Unmentored projects are additional ideas that you may consider, but you will have to contact us to find a mentor. You may also propose your own project that may be a better match for your interest and skill level.&lt;br /&gt;
&lt;br /&gt;
2. '''Contact us.''' If you find a project that you are interested in then get in touch with the community and let us know. In case you want to work on a qualification task, you should ask the respective mentor(s) so that the task can be claimed.&lt;br /&gt;
&lt;br /&gt;
3. '''Apply.''' The student proposal period begins 10 March 19:00 UTC and ends 21 March 19:00 UTC. See the [http://www.google-melange.com/gsoc/document/show/gsoc_program/google/gsoc2014/help_page#2._What_is_the_program_timeline GSoC timeline] for additional information.&lt;br /&gt;
&lt;br /&gt;
=== Qualification Tasks ===&lt;br /&gt;
&lt;br /&gt;
In order to get accepted you will be requested to complete a small task in the area in which you want to contribute. FFmpeg GSoC projects can be challenging, and a qualification task will show us that you are motivated and have the potential to successfully finish a project.&lt;br /&gt;
&lt;br /&gt;
The qualification task is usually shown in the project description. Contact the respective mentor(s) for assistance on getting a related qualification task or if you want to propose your own. Browse the [https://trac.ffmpeg.org FFmpeg Bug Tracker] for qualification task ideas.&lt;br /&gt;
&lt;br /&gt;
=== Contacting FFmpeg ===&lt;br /&gt;
&lt;br /&gt;
If you have questions or comments feel free to contact us via our mailing list, IRC channel, or e-mail one of the FFmpeg GSoC admins:&lt;br /&gt;
&lt;br /&gt;
* '''Mailing-list:''' [http://ffmpeg.org/mailman/listinfo/ffmpeg-devel ffmpeg-devel]&lt;br /&gt;
* '''IRC:''' ''#ffmpeg-devel'' on Freenode&lt;br /&gt;
* '''FFmpeg GSoC Admins:''' TBA&lt;br /&gt;
&lt;br /&gt;
You can also contact a mentor directly if you have questions specifically related to one of the projects listed on this page.&lt;br /&gt;
&lt;br /&gt;
= Mentored Projects =&lt;br /&gt;
&lt;br /&gt;
This section lists well-defined projects that have one or more available mentors. If you are new to FFmpeg, and have relatively little experience with multimedia, you should favor a mentored project rather than propose your own. Contact the respective mentor(s) to get more information about the project and the requested qualification task.&lt;br /&gt;
&lt;br /&gt;
== H.264 Multiview Video Coding (MVC) ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div class=&amp;quot;floatright&amp;quot;&amp;gt;[[Image:Mmspg-epfl-ch-double-camera.jpg]]&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Description:''' MVC samples exist and the codec is used on Blu-ray media, but FFmpeg is missing a decoder. Since this project also consists of some changes in the current architecture, it is especially important that this project is discussed on the [http://ffmpeg.org/mailman/listinfo/ffmpeg-devel ffmpeg-devel mailing list].&lt;br /&gt;
&lt;br /&gt;
'''Expected results:''' Create MVC decoder and add a test for the FFmpeg Automated Testing Environment (FATE).&lt;br /&gt;
&lt;br /&gt;
'''Prerequisites:''' C coding skills, basic familiarity with git.&lt;br /&gt;
&lt;br /&gt;
'''Qualification Task:''' Perform work that demonstrates understanding of MVC and that is a subpart of the whole MVC implementation.&lt;br /&gt;
&lt;br /&gt;
'''Mentor:''' TBA, possibly [[User:Michael|Michael Niedermayer]] (''michaelni'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
'''Backup mentor:''' TBA, possibly Kieran Kunhya (''kierank'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
== Animated Portable Network Graphics (APNG) ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div class=&amp;quot;floatright&amp;quot;&amp;gt;[[Image:Animated PNG example bouncing beach ball.png]]&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Description:''' FFmpeg currently does not support Animated PNGs; the goal of this project is to change that and add support. The little bouncing ball animation shown to the right is such an APNG file.&lt;br /&gt;
&lt;br /&gt;
'''Specification:''' https://wiki.mozilla.org/APNG_Specification&lt;br /&gt;
&lt;br /&gt;
'''Expected results:'''&lt;br /&gt;
* APNG demuxer&lt;br /&gt;
** implement robust probing:&lt;br /&gt;
*** PNG images are not misdetected as APNG animations&lt;br /&gt;
*** APNG animations are not misdetected as PNG images&lt;br /&gt;
** split the stream into sensible packets (so they can be easily reused by the APNG muxer)&lt;br /&gt;
** survive fuzzing (zzuf)&lt;br /&gt;
** add FATE coverage; coverage should be at least 70%&lt;br /&gt;
** test the code under valgrind so that no invalid reads/writes happen&lt;br /&gt;
&lt;br /&gt;
* APNG decoder&lt;br /&gt;
** use existing PNG decoder code (write decoder in same file)&lt;br /&gt;
** implement parsing of all APNG chunks (acTL, fcTL, fdAT)&lt;br /&gt;
** error handling&lt;br /&gt;
** survive fuzzing (zzuf)&lt;br /&gt;
** add a test for FATE; coverage should be at least 75%&lt;br /&gt;
** CRC checksum validation&lt;br /&gt;
** test the code under valgrind so that no invalid reads/writes happen&lt;br /&gt;
&lt;br /&gt;
* APNG muxer &amp;amp;&amp;amp; APNG encoder&lt;br /&gt;
** use existing PNG encoder code (write encoder in same file)&lt;br /&gt;
** write compliant files, make sure they play correctly in major web browsers that support APNG&lt;br /&gt;
** add test for FATE&lt;br /&gt;
&lt;br /&gt;
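The probing rule above (an APNG is a PNG whose acTL chunk precedes the first IDAT chunk) can be sketched as a simple chunk walk. This is an illustrative stand-alone Python sketch, not libavformat code; the function name is hypothetical:&lt;br /&gt;

```python
import struct

PNG_SIG = bytes([137, 80, 78, 71, 13, 10, 26, 10])

def is_apng(data):
    """Return True when an acTL chunk appears before the first IDAT chunk."""
    if not data.startswith(PNG_SIG):
        return False
    pos = 8
    while len(data) >= pos + 8:
        length, ctype = struct.unpack(">I4s", data[pos:pos + 8])
        if ctype == b"acTL":   # animation control chunk: this is an APNG
            return True
        if ctype == b"IDAT":   # image data reached first: plain PNG
            return False
        pos += 8 + length + 4  # chunk header + payload + CRC
    return False
```

A real demuxer probe additionally has to validate the IHDR chunk and cope with truncated input, but the ordering test above is the core of not misdetecting PNG as APNG or vice versa.&lt;br /&gt;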
'''Prerequisites:''' C coding skills, basic familiarity with git.&lt;br /&gt;
&lt;br /&gt;
'''Qualification Task:''' Implement format autodetection for imagepipe and image demuxer.&lt;br /&gt;
&lt;br /&gt;
'''Mentor:''' [[User:Pbm|Paul B Mahol]] (''durandal_1707'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
'''Backup mentor:''' TBA, possibly [[User:Suxen_drol|Peter Ross]] (''pross-au'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
== FFv1 P frame support ==&lt;br /&gt;
&lt;br /&gt;
'''Description:''' FFv1 is one of the most efficient intra-only lossless video codecs. Your work will be to add support for P frames with motion compensation and motion estimation support (the existing motion estimation code in libavcodec can be reused here). Then fine-tune it until the best compression rate is achieved. This will make FFv1 competitive with existing I+P frame lossless codecs like lossless H.264.&lt;br /&gt;
&lt;br /&gt;
'''Expected results:''' State of the art P frame support in the FFv1 encoder and decoder implementation.&lt;br /&gt;
&lt;br /&gt;
'''Prerequisites:''' C coding skills, basic familiarity with git, solid understanding of video coding especially with motion compensation.&lt;br /&gt;
&lt;br /&gt;
'''Qualification Task:''' Implement support for simple P frames without motion compensation in FFv1, i.e. each frame stores the difference from the previous frame.&lt;br /&gt;
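As an illustration of the qualification task, a difference-only P frame reduces to a per-sample residual. The following stand-alone Python sketch shows the idea on flat sample arrays; it is a hypothetical illustration, not FFv1 code, and ignores entropy coding entirely:&lt;br /&gt;

```python
def encode_pframe(prev, cur):
    """Store each sample as the difference from the previous frame, mod 256."""
    return [(c - p) % 256 for p, c in zip(prev, cur)]

def decode_pframe(prev, residual):
    """Add the residual back to the previous frame to reconstruct exactly."""
    return [(p + r) % 256 for p, r in zip(prev, residual)]
```

Because the modulo arithmetic is exactly reversible, the round trip is lossless, which is the property the FFv1 range coder would then exploit: residuals of static regions are zero and compress extremely well.&lt;br /&gt;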
&lt;br /&gt;
'''Mentor:''' [[User:Michael|Michael Niedermayer]] (''michaelni'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
'''Backup mentor:''' TBA&lt;br /&gt;
&lt;br /&gt;
== Misc Libavfilter extension ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div class=&amp;quot;floatright&amp;quot;&amp;gt;[[Image:Lavfi-gsoc-filter-vintage-illustration.jpg]]&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Description:''' Libavfilter is the FFmpeg filtering library. It currently supports audio and video filtering and generation. This work may focus on porting, fixing, extending, or writing new audio and video filters from scratch.&lt;br /&gt;
&lt;br /&gt;
Candidate filters for porting may be the remaining MPlayer filters currently supported through the mp wrapper, libaf MPlayer filters, and filters from other frameworks (e.g. mjpegtools, transcode, avisynth, virtualdub, etc.). In the case of mp ports, the student should verify that the new filter produces the same output and is not slower.&lt;br /&gt;
&lt;br /&gt;
Some ideas for more filters:&lt;br /&gt;
* a frequency-domain filter relying on the FFT utilities in libavcodec&lt;br /&gt;
* a controller filter which allows sending commands to other filters (e.g. to adjust volume, contrast, etc.), like the sendcmd filter but through an interactive GUI&lt;br /&gt;
* a Lua scripting filter, which allows implementing custom filtering logic in Lua&lt;br /&gt;
&lt;br /&gt;
For more ideas check [https://trac.ffmpeg.org/query?status=new&amp;amp;status=open&amp;amp;status=reopened&amp;amp;component=avfilter&amp;amp;col=id&amp;amp;col=summary&amp;amp;col=status&amp;amp;col=type&amp;amp;col=priority&amp;amp;col=component&amp;amp;col=version&amp;amp;order=priority trac libavfilter tickets].&lt;br /&gt;
&lt;br /&gt;
'''Expected results:''' Write or port audio and video filters and possibly fix/extend libavfilter API and design when required.&lt;br /&gt;
&lt;br /&gt;
'''Prerequisites:''' C coding skills, basic familiarity with git. Some background on DSP and image/sound processing techniques would be a bonus but is not strictly required.&lt;br /&gt;
&lt;br /&gt;
'''Qualification Task:''' Write or port one or more filters.&lt;br /&gt;
&lt;br /&gt;
'''Mentor:''' TBA, possibly [[User:Stefanosa|Stefano Sabatini]] (''saste'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
'''Backup mentor:''' [[User:Ubitux|Clément Bœsch]] (''ubitux'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
== Subtitles ==&lt;br /&gt;
&lt;br /&gt;
'''Description:''' FFmpeg has been working on improving its subtitles support recently, notably by adding support for various text subtitle formats and various hardsubbing (burning the subtitles onto the video) facilities. While the theme may sound relatively simple compared to audio/video signal processing, the project carries a historical burden that is not easy to deal with, and introduces various issues very specific to the sparse form of subtitle streams.&lt;br /&gt;
&lt;br /&gt;
'''Expected results:'''&lt;br /&gt;
* Add support for new subtitles formats. Example: a demuxer for .SUP files, just like VobSub but for Blu-Ray, or a VobSub muxer.&lt;br /&gt;
* Improve text subtitles decoders. Typically, this can be supporting advanced markup features in SAMI or WebVTT.&lt;br /&gt;
* Update the API to get rid of the clumsy internal text representation of styles.&lt;br /&gt;
* Proper integration of subtitles into libavfilter. This is the ultimate goal, as it will notably allow complete subtitles rendering in applications such as ffplay.&lt;br /&gt;
* BONUS: if everything goes well, the student will be allowed to add basic support for teletext.&lt;br /&gt;
&lt;br /&gt;
'''Prerequisites:''' C coding skills, basic familiarity with git. Some background in fansubbing area (notably ASS experience) would be a bonus but is not strictly required.&lt;br /&gt;
&lt;br /&gt;
'''Qualification Task:''' Write a subtitle demuxer and decoder (for example, support for the Spruce subtitle format) to make sure the whole subtitles chain is understood.&lt;br /&gt;
&lt;br /&gt;
'''Mentor:''' [[User:Ubitux|Clément Bœsch]] (''ubitux'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
'''Backup Mentor:''' TBA, possibly Nicolas George (''Cigaes'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
== Postproc optimizations ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div class=&amp;quot;floatright&amp;quot;&amp;gt;[[Image:PostProc.jpg]]&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Description:''' FFmpeg contains libpostproc, which is used to postprocess 8x8 DCT-MC based video and images (JPEG, MPEG-1/2/4, and H.263, among others). Postprocessing removes blocking (and other) artifacts from low-bitrate / low-quality images and videos. The code, however, was written a long time ago, and its SIMD optimizations need to be updated to what modern CPUs support (SSE2+ and AVX2).&lt;br /&gt;
&lt;br /&gt;
'''Expected results:'''&lt;br /&gt;
* Convert all gcc inline asm in libpostproc to YASM.&lt;br /&gt;
* Restructure the code so that it works with block sizes compatible with modern SIMD.&lt;br /&gt;
* Add Integer SSE2 and AVX2 optimizations for each existing MMX/MMX2/3dnow optimization in libpostproc.&lt;br /&gt;
&lt;br /&gt;
'''Prerequisites:''' C coding skills, good x86 assembly coding skills, basic familiarity with git.&lt;br /&gt;
&lt;br /&gt;
'''Qualification Task:''' convert 1 or 2 MMX2 functions to SSE2 and AVX2.&lt;br /&gt;
&lt;br /&gt;
'''Mentor:''' [[User:Michael|Michael Niedermayer]] (''michaelni'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
'''Backup Mentor:''' TBA, possibly [[User:Stefanosa|Stefano Sabatini]] (''saste'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Bayer RGB colorspaces ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div class=&amp;quot;floatright&amp;quot;&amp;gt;[[Image:350px-Bayer_pattern_on_sensor.svg.png ]]&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Description:''' Several image and video formats store pixels using Bayer-pattern colorspaces. Supporting these formats would broaden FFmpeg's applicability to RAW still and video photography processing.&lt;br /&gt;
&lt;br /&gt;
'''Expected results:'''&lt;br /&gt;
* Rebase existing patches&lt;br /&gt;
* Implement high quality bayer transformations in libswscale (plain C)&lt;br /&gt;
* Add bayer formats to the libavutil pixfmt enumeration routines&lt;br /&gt;
* SIMD optimizations of the libswscale transformations&lt;br /&gt;
* Complete the PhotoCINE demuxer to support the Bayer format (or another format of your choosing)&lt;br /&gt;
&lt;br /&gt;
Optional goodies:&lt;br /&gt;
* Extend TIFF decoder to support DNG-Bayer format&lt;br /&gt;
* Support a popular proprietary camera format (many to choose from; see dcraw project)&lt;br /&gt;
&lt;br /&gt;
'''Prerequisites:''' C coding skills, basic familiarity with git.&lt;br /&gt;
&lt;br /&gt;
'''Qualification Task:''' Implement a simple and working Bayer-&amp;gt;RGB transform in libswscale.&lt;br /&gt;
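To make the qualification task concrete, a minimal nearest-neighbour demosaic of an RGGB mosaic can be sketched as below. This is a hypothetical stand-alone Python illustration of the transform, not libswscale code; a high-quality implementation would interpolate rather than replicate cell values:&lt;br /&gt;

```python
def debayer_rggb(raw, width, height):
    """Nearest-neighbour demosaic of an RGGB Bayer mosaic.

    raw is a flat list of width*height sensor samples; width and height
    are assumed even. Returns one (r, g, b) tuple per output pixel.
    """
    out = []
    for y in range(height):
        for x in range(width):
            # top-left corner of the 2x2 RGGB cell containing (x, y)
            cy = y - y % 2
            cx = x - x % 2
            r = raw[cy * width + cx]            # R sits at (0, 0) in the cell
            g1 = raw[cy * width + cx + 1]       # G at (0, 1)
            g2 = raw[(cy + 1) * width + cx]     # G at (1, 0)
            b = raw[(cy + 1) * width + cx + 1]  # B at (1, 1)
            out.append((r, (g1 + g2) // 2, b))
    return out
```

Averaging the two green samples per cell reflects the fact that a Bayer mosaic carries twice as many green samples as red or blue; the other popular patterns (BGGR, GRBG, GBRG) only permute the cell positions.&lt;br /&gt;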
&lt;br /&gt;
'''Mentor:''' TBA, possibly [[User:Suxen_drol|Peter Ross]] (''pross-au'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
'''Backup Mentor:''' [[User:Michael|Michael Niedermayer]] (''michaelni'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== MPEG-4 ALS encoder ==&lt;br /&gt;
&lt;br /&gt;
'''Description:'''&lt;br /&gt;
An MPEG-4 ALS decoder was implemented several years ago, but an encoder is still missing from the official codebase. A rudimentary encoder has already been written and is available on [https://github.com/justinruggles/FFmpeg-alsenc.git github]. For this project, that encoder first has to be updated to fit into the current FFmpeg codebase and tested for conformance using the [http://www.nue.tu-berlin.de/menue/forschung/projekte/beendete_projekte/mpeg-4_audio_lossless_coding_als/parameter/en/#230252 reference codec and specifications]. Second, the encoder has to be brought through the usual review process so that it hits the codebase by the end of the project.&lt;br /&gt;
&lt;br /&gt;
'''Expected results:'''&lt;br /&gt;
&lt;br /&gt;
* Update the existing encoder to fit into the current codebase.&lt;br /&gt;
* Ensure conformance of the encoder by verifying against the reference codec, and generate a test case for FATE.&lt;br /&gt;
* Ensure the FFmpeg decoder processes all generated files without warnings.&lt;br /&gt;
* Enhance the rudimentary feature set of the encoder.&lt;br /&gt;
&lt;br /&gt;
'''Prerequisites:''' C coding skills, basic familiarity with git. A certain interest in audio coding and/or knowledge about the FFmpeg codebase could be beneficial.&lt;br /&gt;
&lt;br /&gt;
'''Qualification Task:''' Add floating point support to MPEG-4 ALS decoder&lt;br /&gt;
&lt;br /&gt;
'''Mentor:''' [[User:Pbm|Paul B Mahol]] (''durandal_1707'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
'''Backup Mentor:''' TBA, possibly [[User:Stefanosa|Stefano Sabatini]] (''saste'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Hardware Acceleration API Software/Tracing Implementation ==&lt;br /&gt;
&lt;br /&gt;
'''Description:''' Our support for hardware accelerated decoding remains basically untested. This is in part because FFmpeg only implements part of the required steps, and in part because testing requires specific operating systems and hardware.&lt;br /&gt;
&lt;br /&gt;
The idea would be to start with a simple stub implementation of an API such as VDPAU that provides only the core functions. These would serialize the function calls and the data they receive, to allow for easy comparison and thus regression testing. Improvements to this approach include adding basic input validation and replay capability, to allow testing regression data against real hardware. This would be similar to what [https://github.com/apitrace/apitrace apitrace] does for OpenGL.&lt;br /&gt;
&lt;br /&gt;
A further step would be to actually add support for decoding in software, so that full testing including visual inspection is possible without the need for special hardware.&lt;br /&gt;
&lt;br /&gt;
'''Prerequisites:''' C coding skills, basic familiarity with git&lt;br /&gt;
&lt;br /&gt;
'''Qualification Task:''' Anything related to the hardware acceleration code; producing first ideas and code pieces for this project would also be reasonable.&lt;br /&gt;
&lt;br /&gt;
'''Mentor:''' Reimar Döffinger (''reimar'' in #ffmpeg-devel on Freenode IRC, but since I'm rarely there better email me first: Reimar.Doeffinger [at] gmx.de)&lt;br /&gt;
&lt;br /&gt;
== Hardware Acceleration (hwaccel) API v2 ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div class=&amp;quot;floatright&amp;quot;&amp;gt;[[Image:Hardware.jpg]]&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Description:''' FFmpeg supports hardware accelerated decoding through the internal hwaccel API. Currently supported system hardware acceleration APIs are VA-API (Linux), DXVA2 (Windows) and VDA (Mac OS X). However, the current approach requires client applications to allocate the underlying resources (e.g. hardware surfaces and context) themselves and hand them over to FFmpeg. This incurs a few limitations: it does not scale to new codecs, i.e. it requires new tokens for each newly supported codec; it creates extra work in the client application, which tends to be duplicated across client applications; and it prevents efficient fallback to software decoding if the hardware cannot handle a particular codec specification.&lt;br /&gt;
&lt;br /&gt;
The goal of this project is to revamp the FFmpeg hardware acceleration API so that hardware resources are allocated and managed inside the library, requiring the client application to provide only a single hardware context/device handle; to provide a way to fall back early to software decoding if the underlying hardware cannot handle the bitstream; and to make it possible to select a hardware accelerator by ID without polluting the PixelFormats namespace.&lt;br /&gt;
&lt;br /&gt;
'''Expected results:'''&lt;br /&gt;
* FFmpeg core library (libavcodec):&lt;br /&gt;
** Core API extensions and improvements&lt;br /&gt;
*** Add open/close hooks in a way that is backwards compatible with hwaccel v1 enabled applications&lt;br /&gt;
*** Add new tokens describing hardware accelerators&lt;br /&gt;
*** Add new flags exposing HW capabilities like download/upload&lt;br /&gt;
*** Investigate the benefits or impacts to provide a global map/unmap capability to FFmpeg video buffers&lt;br /&gt;
** Port hwaccels to v2 infrastructure&lt;br /&gt;
*** Port VA-API decoders to v2 infrastructure&lt;br /&gt;
*** Validate that VA-API decoders still work with existing applications supporting hwaccel v1&lt;br /&gt;
*** Provide download capability through ''vaGetImage()''&lt;br /&gt;
*** Validate that ffplay can support this feature with minor changes, and definitely no change to the existing SDL renderer&lt;br /&gt;
*** Port VDPAU decoders to hwaccel v2 (optional), and investigate ways to preserve compatibility with older applications&lt;br /&gt;
&lt;br /&gt;
* FFmpeg applications:&lt;br /&gt;
** Integrate hardware acceleration into ffplay&lt;br /&gt;
*** Create a video-output (VO) infrastructure in ffplay&lt;br /&gt;
*** Port the SDL renderer to the new VO infrastructure&lt;br /&gt;
*** Add support for VA-API: VA renderer through ''vaPutSurface()'', add -hwaccel option to select &amp;quot;vaapi&amp;quot; renderer&lt;br /&gt;
*** Add support for VDPAU (optional): VDPAU renderer through ''VdpPresentationQueueDisplay()''&lt;br /&gt;
** Integrate hardware acceleration into ffmpeg&lt;br /&gt;
*** Add support for VA-API: use the VA/DRM API for headless (no-X display) decoding, use libudev to determine the device to use&lt;br /&gt;
&lt;br /&gt;
'''Prerequisites:''' C coding skills, basic familiarity with git, hardware supporting VA-API.&lt;br /&gt;
&lt;br /&gt;
'''Qualification Task:''' Anything related to the Hardware Acceleration (hwaccel) API, or to its related users. e.g. add JPEG decoding support with VA-API, etc.&lt;br /&gt;
&lt;br /&gt;
'''Mentor:''' TBA, possibly [[User:Gwenole_Beauchesne|Gwenole Beauchesne]] (''__gb__'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
'''Backup Mentor:''' TBA&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Hardware Accelerated Video Encoding with VA-API ==&lt;br /&gt;
&lt;br /&gt;
'''Description:''' FFmpeg already supports hardware accelerated decoding for multiple codecs but still lacks support for hardware accelerated encoding. The aim of the project is to add support for encoding with VA-API specifically, while keeping a generic enough approach in mind so that other hardware accelerators (TI-DSP, CUDA?) could be supported as well. This means that new ''hwaccel'' hooks are needed, and two operational modes are possible: either ''(i)'' the driver or hardware packs the headers itself, or ''(ii)'' latitude is left to perform this task at the FFmpeg library level.&lt;br /&gt;
&lt;br /&gt;
'''Expected results:''' Allow MPEG-2 and H.264 encoding with VA-API, while supporting variable bitrate (VBR) by default, and allowing alternate methods like constant bitrate (CBR) or constant QP (CQP) where appropriate or requested.&lt;br /&gt;
* MPEG-2 encoding:&lt;br /&gt;
** Add basic encoding with I/P frames (handle the ''-g'' option)&lt;br /&gt;
** Add support for B frames (handle the ''-bf'' option)&lt;br /&gt;
** Add support for constant bitrate (CBR, i.e. ''maxrate == bitrate'' and ''bufsize'' set)&lt;br /&gt;
** (Optionally) add support for interlaced contents&lt;br /&gt;
* H.264 encoding:&lt;br /&gt;
** Add basic encoding with I/P frames (handle the ''-g'' option)&lt;br /&gt;
** Add support for B frames (handle the ''-bf'' option)&lt;br /&gt;
** Add support for constant bitrate (CBR, i.e. ''maxrate == bitrate'' and ''bufsize'' set)&lt;br /&gt;
** Add support for constant QP (CQP, i.e. handle the ''-cqp'' option)&lt;br /&gt;
** Add support for more than one reference frame, while providing/using API to query the hardware capabilities&lt;br /&gt;
** Work on HRD conformance. May require to write an independent tool to assess that&lt;br /&gt;
** (Optionally) add configurability of the motion estimation method to use. Define new types for HW accelerated encoding with at least two levels/hints for the accelerator.&lt;br /&gt;
* FFmpeg applications:&lt;br /&gt;
** Define common hwaccel interface for encoding&lt;br /&gt;
** Add initial support for hardware accelerated encoding to the ''ffmpeg'' application&lt;br /&gt;
&lt;br /&gt;
'''Prerequisites:''' C coding skills, basic familiarity with git, hardware supporting VA-API for encoding.&lt;br /&gt;
&lt;br /&gt;
'''Qualification Task:''' Anything related to the Hardware Acceleration (hwaccel) API, or to its related users. e.g. add JPEG decoding support with VA-API, etc.&lt;br /&gt;
&lt;br /&gt;
'''Mentor:''' TBA, possibly [[User:Gwenole_Beauchesne|Gwenole Beauchesne]] (''__gb__'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
'''Backup Mentor:''' TBA, possibly Tushar Gohad&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot;/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== AAC Improvements ==&lt;br /&gt;
&lt;br /&gt;
'''Description:''' FFmpeg contains an AAC encoder and decoder; both can be improved in various ways. This is enough work for more than one GSoC project, so part of your submission should be to define exactly which task you want to work on.&lt;br /&gt;
* AAC BSAC decoder: This has already been started, but the existing decoder still fails on many samples&lt;br /&gt;
* AAC SSR decoder&lt;br /&gt;
* AAC 960/120 MDCT window&lt;br /&gt;
&lt;br /&gt;
'''Qualification Task:''' See the FFmpeg bug tracker for AAC issues; rebasing the existing incomplete BSAC decoder onto the current git head or fixing one or more existing bugs are possible qualification tasks.&lt;br /&gt;
&lt;br /&gt;
'''Prerequisites:''' C coding skills, basic familiarity with git, knowledge about transform based audio coding would be useful.&lt;br /&gt;
&lt;br /&gt;
'''Mentor:''' Baptiste Coudurier (''bcoudurier'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
'''Backup Mentor:''' TBA, possibly [[User:Stefanosa|Stefano Sabatini]] (''saste'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot;/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== DTS / DCA Improvements ==&lt;br /&gt;
&lt;br /&gt;
'''Description:''' FFmpeg contains a DTS decoder.&lt;br /&gt;
* DTS-HD decoder improvements: A possible qualification task is to implement ticket [https://trac.ffmpeg.org/ticket/1920 #1920]&lt;br /&gt;
** Add support for X96 extension (96 kHz)&lt;br /&gt;
** Add support for XLL extension (lossless)&lt;br /&gt;
** Add support for pure DTS-HD streams that do not contain a DTS core&lt;br /&gt;
** Add support for multiple assets&lt;br /&gt;
** Add support for LBR extension&lt;br /&gt;
&lt;br /&gt;
'''Prerequisites:''' C coding skills, basic familiarity with git.&lt;br /&gt;
&lt;br /&gt;
'''Mentor:''' Benjamin Larsson (''merbanan'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
'''Backup Mentor:''' TBA, possibly [[User:Stefanosa|Stefano Sabatini]] (''saste'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
== MXF Demuxer Improvements ==&lt;br /&gt;
&lt;br /&gt;
'''Description:''' FFmpeg's MXF demuxer needs a proper, compact way to map EssenceContainer ULs to WrappingKind. See ticket #2776; the notes written in ticket #1916 are also relevant.&lt;br /&gt;
&lt;br /&gt;
The gist of this is that essence in MXF is typically stored in one of two ways: as an audio/video interleave, or with each stream in one huge chunk (like 1 GiB of audio followed by 10 GiB of video). Previous ways of telling these apart have been technically wrong, but have worked due to a lack of samples demonstrating the contrary.&lt;br /&gt;
&lt;br /&gt;
'''Expected results:''' The sample in ticket #2776 demuxes fine and there's a test case in FATE for it. The solution should grow libavformat by no more than 32 KiB.&lt;br /&gt;
&lt;br /&gt;
'''Prerequisites:''' C coding skills, basic familiarity with git.&lt;br /&gt;
&lt;br /&gt;
'''Qualification Task:''' Investigate whether there may be a compact way of representing the UL -&amp;gt; WrappingKind mapping specified in the official RP224 Excel document. The table takes up about half a megabyte verbatim, which is unacceptable in a library as large as libavformat.&lt;br /&gt;
&lt;br /&gt;
'''Mentor:''' TBA, possibly Tomas Härdin (''thardin'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
'''Backup Mentor:''' TBA&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot;/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Unmentored Projects =&lt;br /&gt;
&lt;br /&gt;
This is a list of projects that students are encouraged to consider if a mentored project is unavailable or not within the student's skills or interests. The student will have to find a mentor for the project. A student can also [[#Your_Own_Idea|propose their own project]].&lt;br /&gt;
&lt;br /&gt;
== glplay ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div class=&amp;quot;floatleft&amp;quot;&amp;gt;[[Image:Opengl_logo.jpg]]&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Description:''' The SDL library used by FFplay has some deficiencies; adding OpenGL output to FFplay should allow for better performance (and fewer bugs, at least for some hardware/driver combinations). This could be a new application (glplay), but it is probably simpler to extend ffplay to use OpenGL. You can use code from MPlayer's OpenGL vo module, which may be relicensed under the LGPL.&lt;br /&gt;
&lt;br /&gt;
'''Mentor:''' TBD Backup: Reimar Döffinger&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== TrueHD encoder ==&lt;br /&gt;
&lt;br /&gt;
'''Description:''' FFmpeg currently does not support encoding to one of the lossless audio formats used on Blu-ray discs. This task consists of implementing a TrueHD encoder that allows losslessly encoding audio for playback on hardware devices capable of TrueHD decoding.&lt;br /&gt;
&lt;br /&gt;
== Opus decoder ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div class=&amp;quot;floatright&amp;quot;&amp;gt;[[Image:Opus.png]]&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Description:''' Opus decoding is currently supported through the external libopus library.&lt;br /&gt;
* Write a native decoder, continuing work on the existing unfinished implementation&lt;br /&gt;
A possible qualification task is to port the existing incomplete decoder to current git head and improve it, to show that you are capable of working on this task.&lt;br /&gt;
&lt;br /&gt;
== VC-1 interlaced ==&lt;br /&gt;
&lt;br /&gt;
'''Description:''' The FFmpeg VC-1 decoder has improved over the years, but many samples are still not decoded bit-exact and real-world interlaced streams typically show artefacts.&lt;br /&gt;
* Implement missing interlace features&lt;br /&gt;
* Make more reference samples bit-exact&lt;br /&gt;
As a qualification task, you should try to find a bug in the current decoder implementation and fix it.&lt;br /&gt;
&lt;br /&gt;
== JPEG 2000 ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div class=&amp;quot;floatleft&amp;quot;&amp;gt;[[Image:Jpeg2000.jpg]]&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Description:''' FFmpeg contains an experimental native JPEG 2000 encoder and decoder. Both are missing many features; see the FFmpeg bug tracker for some unsupported samples.&lt;br /&gt;
Work on an issue (for example from the bug tracker) as a qualification task to show that you are capable of improving the codec implementation.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== VP8L ==&lt;br /&gt;
&lt;br /&gt;
'''Description:''' [[VP8L]] is a lossless format used in WebP. There is no support for this in FFmpeg.&lt;br /&gt;
&lt;br /&gt;
== Your Own Project Idea ==&lt;br /&gt;
&lt;br /&gt;
A student can propose a project. Ideas can also be found by browsing bugs and feature requests on our [https://trac.ffmpeg.org/ bug tracker]. The work should last the majority of the GSoC duration, the task must be approved by the developers, and a mentor must be assigned.&lt;br /&gt;
&lt;br /&gt;
Students can discuss an idea in the [http://ffmpeg.org/mailman/listinfo/ffmpeg-devel ffmpeg-devel mailing-list], the #ffmpeg-devel IRC channel, or contact the FFmpeg GSoC admins for more information.&lt;br /&gt;
&lt;br /&gt;
[[Category:FFmpeg]]&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=FFmpeg_Summer_of_Code_2014&amp;diff=14952</id>
		<title>FFmpeg Summer of Code 2014</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=FFmpeg_Summer_of_Code_2014&amp;diff=14952"/>
		<updated>2014-02-05T18:32:51Z</updated>

		<summary type="html">&lt;p&gt;Reimar: Add Hardware Acceleration API Software/Tracing Implementation&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page is on wiki.multimedia.cx due to trac.ffmpeg.org being down for maintenance. It might be moved back later.&lt;br /&gt;
&lt;br /&gt;
= Introduction =&lt;br /&gt;
&lt;br /&gt;
FFmpeg is the universal multimedia toolkit: a complete, cross-platform solution to record, convert, filter and stream audio and video. It includes libavcodec - the leading audio/video codec library.&lt;br /&gt;
&lt;br /&gt;
[https://developers.google.com/open-source/soc/ Google Summer of Code (GSoC)] is a program that offers students stipends to write code for open source projects. Through the guidance of mentors, students gain valuable experience interacting with and coding for open source projects like FFmpeg. Additionally, the project and its users benefit from code created by students, who often continue contributing as developers. FFmpeg participated in several past editions ([[FFmpeg Summer Of Code 2006|2006]], [[FFmpeg Summer Of Code 2007|2007]], [[FFmpeg Summer Of Code 2008|2008]], [[FFmpeg Summer Of Code 2009|2009]], [[FFmpeg Summer Of Code 2010|2010]], and [[FFmpeg / Libav Summer Of Code 2011|2011]]), and we are looking forward to being involved this year.&lt;br /&gt;
&lt;br /&gt;
This is our ideas page for [http://www.google-melange.com/gsoc/homepage/google/gsoc2014 Google Summer of Code 2014]. See the GSoC Timeline for important dates.&lt;br /&gt;
&lt;br /&gt;
== Information for Students ==&lt;br /&gt;
&lt;br /&gt;
=== Getting Started ===&lt;br /&gt;
&lt;br /&gt;
0. '''Get to know FFmpeg.''' If you are a student and interested in contributing to an FFmpeg GSoC project it is recommended to start by subscribing to the [http://ffmpeg.org/mailman/listinfo/ffmpeg-devel ffmpeg-devel] mailing-list, visiting our IRC channels (''#ffmpeg-devel'' and ''#ffmpeg''), and exploring the codebase and the development workflow. Feel free to [[#Contacting_FFmpeg|contact us]] if you have any questions.&lt;br /&gt;
&lt;br /&gt;
1. '''Find a project.''' Listed on this page are mentored and unmentored projects. Mentored projects are well-defined and mentor(s) have already volunteered. Unmentored projects are additional ideas that you may consider, but you will have to contact us to find a mentor. You may also propose your own project that may be a better match for your interest and skill level.&lt;br /&gt;
&lt;br /&gt;
2. '''Contact us.''' If you find a project that you are interested in then get in touch with the community and let us know. In case you want to work on a qualification task, you should ask the respective mentor(s) so that the task can be claimed.&lt;br /&gt;
&lt;br /&gt;
3. '''Apply.''' The student proposal period begins 10 March 19:00 UTC and ends 21 March 19:00 UTC. See the [http://www.google-melange.com/gsoc/document/show/gsoc_program/google/gsoc2014/help_page#2._What_is_the_program_timeline GSoC timeline] for additional information.&lt;br /&gt;
&lt;br /&gt;
=== Qualification Tasks ===&lt;br /&gt;
&lt;br /&gt;
In order to get accepted you will be requested to complete a small task in the area you want to contribute to. FFmpeg GSoC projects can be challenging, and a qualification task will show us that you are motivated and have the potential to successfully finish a project.&lt;br /&gt;
&lt;br /&gt;
The qualification task is usually shown in the project description. Contact the respective mentor(s) for assistance on getting a related qualification task or if you want to propose your own. Browse the [https://trac.ffmpeg.org FFmpeg Bug Tracker] for qualification task ideas.&lt;br /&gt;
&lt;br /&gt;
=== Contacting FFmpeg ===&lt;br /&gt;
&lt;br /&gt;
If you have questions or comments feel free to contact us via our mailing list, IRC channel, or e-mail one of the FFmpeg GSoC admins:&lt;br /&gt;
&lt;br /&gt;
* '''Mailing-list:''' [http://ffmpeg.org/mailman/listinfo/ffmpeg-devel ffmpeg-devel]&lt;br /&gt;
* '''IRC:''' ''#ffmpeg-devel'' on Freenode&lt;br /&gt;
* '''FFmpeg GSoC Admins:''' TBA&lt;br /&gt;
&lt;br /&gt;
You can also contact a mentor directly if you have questions specifically related to one of the projects listed on this page.&lt;br /&gt;
&lt;br /&gt;
= Mentored Projects =&lt;br /&gt;
&lt;br /&gt;
This section lists well-defined projects that have one or more available mentors. If you are new to FFmpeg, and have relatively little experience with multimedia, you should favor a mentored project rather than propose your own. Contact the respective mentor(s) to get more information about the project and the requested qualification task.&lt;br /&gt;
&lt;br /&gt;
== H.264 Multiview Video Coding (MVC) ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div class=&amp;quot;floatright&amp;quot;&amp;gt;[[Image:Mmspg-epfl-ch-double-camera.jpg]]&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Description:''' MVC samples exist and the codec is used on Blu-ray media, but FFmpeg is missing a decoder. Since this project also consists of some changes in the current architecture, it is especially important that this project is discussed on the [http://ffmpeg.org/mailman/listinfo/ffmpeg-devel ffmpeg-devel mailing list].&lt;br /&gt;
&lt;br /&gt;
'''Expected results:''' Create an MVC decoder and add a test for the FFmpeg Automated Testing Environment (FATE).&lt;br /&gt;
&lt;br /&gt;
'''Prerequisites:''' C coding skills, basic familiarity with git.&lt;br /&gt;
&lt;br /&gt;
'''Qualification Task:''' Perform work that demonstrates understanding of MVC and that is a subpart of the whole MVC implementation.&lt;br /&gt;
&lt;br /&gt;
'''Mentor:''' TBA, possibly [[User:Michael|Michael Niedermayer]] (''michaelni'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
'''Backup mentor:''' TBA, possibly Kieran Kunhya (''kierank'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
== Animated Portable Network Graphics (APNG) ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div class=&amp;quot;floatright&amp;quot;&amp;gt;[[Image:Animated PNG example bouncing beach ball.png]]&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Description:''' FFmpeg currently does not support Animated PNGs; the goal of this project is to change that and add support. The little bouncing-ball animation shown on the right is such an APNG file.&lt;br /&gt;
&lt;br /&gt;
'''Specification:''' https://wiki.mozilla.org/APNG_Specification&lt;br /&gt;
&lt;br /&gt;
'''Expected results:'''&lt;br /&gt;
* APNG demuxer&lt;br /&gt;
** implement robust probing:&lt;br /&gt;
*** PNG images are not misdetected as APNG animations&lt;br /&gt;
*** APNG animations are not misdetected as PNG images&lt;br /&gt;
** split the stream into sensible packets (so they can be easily reused in the APNG muxer)&lt;br /&gt;
** survive fuzzing (zzuf)&lt;br /&gt;
** add FATE coverage; coverage should be at least 70%&lt;br /&gt;
** test the code under valgrind so no invalid reads/writes happen&lt;br /&gt;
&lt;br /&gt;
* APNG decoder&lt;br /&gt;
** use existing PNG decoder code (write decoder in same file)&lt;br /&gt;
** implement parsing of all APNG chunks (acTL, fcTL, fdAT)&lt;br /&gt;
** error handling&lt;br /&gt;
** survive fuzzing (zzuf)&lt;br /&gt;
** add a test for FATE; coverage should be at least 75%&lt;br /&gt;
** CRC checksum validation&lt;br /&gt;
** test the code under valgrind so no invalid reads/writes happen&lt;br /&gt;
&lt;br /&gt;
* APNG muxer &amp;amp;&amp;amp; APNG encoder&lt;br /&gt;
** use existing PNG encoder code (write encoder in same file)&lt;br /&gt;
** write compliant files, make sure they play correctly in major web browsers that support APNG&lt;br /&gt;
** add test for FATE&lt;br /&gt;
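For orientation on the chunk items above: PNG chunks are a 4-byte big-endian length, a 4-byte type, the data, and a 4-byte CRC, and APNG's acTL data carries num_frames and num_plays as 32-bit big-endian integers (per the APNG specification). A minimal, hypothetical C sketch of acTL parsing follows; the helper names are illustrative and not libavcodec API:&lt;br /&gt;

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

/* Read a 32-bit big-endian integer. */
static uint32_t rb32(const uint8_t *p)
{
    return ((uint32_t)p[0] << 24) | ((uint32_t)p[1] << 16) |
           ((uint32_t)p[2] << 8)  |  (uint32_t)p[3];
}

/* Parse an acTL chunk: length(4) + "acTL"(4) + num_frames(4) + num_plays(4) + CRC(4).
 * Returns 0 on success, -1 if the buffer is not a valid acTL chunk.
 * CRC validation is omitted in this sketch. */
static int parse_actl(const uint8_t *chunk, size_t size,
                      uint32_t *num_frames, uint32_t *num_plays)
{
    if (size < 16 || rb32(chunk) != 8 || memcmp(chunk + 4, "acTL", 4))
        return -1;
    *num_frames = rb32(chunk + 8);
    *num_plays  = rb32(chunk + 12);
    return 0;
}
```

A real demuxer would additionally validate the CRC and reject acTL chunks appearing after the first fcTL.&lt;br /&gt;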
&lt;br /&gt;
'''Prerequisites:''' C coding skills, basic familiarity with git.&lt;br /&gt;
&lt;br /&gt;
'''Qualification Task:''' Implement format autodetection for imagepipe and image demuxer.&lt;br /&gt;
&lt;br /&gt;
'''Mentor:''' [[User:Pbm|Paul B Mahol]] (''durandal_1707'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
'''Backup mentor:''' TBA, possibly [[User:Suxen_drol|Peter Ross]] (''pross-au'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
== FFv1 P frame support ==&lt;br /&gt;
&lt;br /&gt;
'''Description:''' FFv1 is one of the most efficient intra-only lossless video codecs. Your work will be to add support for P frames with motion compensation and motion estimation (the existing motion estimation code in libavcodec can be reused here), then fine-tune it until the best compression rate is achieved. This will make FFv1 competitive with existing I+P-frame lossless codecs like lossless H.264.&lt;br /&gt;
&lt;br /&gt;
'''Expected results:''' State of the art P frame support in the FFv1 encoder and decoder implementation.&lt;br /&gt;
&lt;br /&gt;
'''Prerequisites:''' C coding skills, basic familiarity with git, solid understanding of video coding especially with motion compensation.&lt;br /&gt;
&lt;br /&gt;
'''Qualification Task:''' Implement support for simple P frames without motion compensation in FFv1, that is, each frame stores the difference from the previous frame.&lt;br /&gt;
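The qualification task can be pictured with a minimal sketch of temporal-difference coding. This is not FFv1's actual code (which would also entropy-code the residual with its range coder); the helper names are invented:&lt;br /&gt;

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

/* Hypothetical sketch: a P frame without motion compensation codes the
 * per-pixel difference from the previous frame; the decoder adds the
 * residual back to reconstruct the current frame losslessly. */
static void encode_residual(const uint8_t *cur, const uint8_t *prev,
                            int16_t *residual, size_t n)
{
    size_t i;
    for (i = 0; i < n; i++)
        residual[i] = (int16_t)cur[i] - (int16_t)prev[i];
}

static void decode_residual(const int16_t *residual, const uint8_t *prev,
                            uint8_t *out, size_t n)
{
    size_t i;
    for (i = 0; i < n; i++)
        out[i] = (uint8_t)(prev[i] + residual[i]);
}
```

Compression comes from the residuals clustering around zero for static content, which the existing FFv1 entropy coder handles well.&lt;br /&gt;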
&lt;br /&gt;
'''Mentor:''' [[User:Michael|Michael Niedermayer]] (''michaelni'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
'''Backup mentor:''' TBA&lt;br /&gt;
&lt;br /&gt;
== Misc Libavfilter extension ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div class=&amp;quot;floatright&amp;quot;&amp;gt;[[Image:Lavfi-gsoc-filter-vintage-illustration.jpg]]&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Description:''' Libavfilter is the FFmpeg filtering library. It currently supports audio and video filtering and generation. This work may focus on porting, fixing, extending, or writing new audio and video filters from scratch.&lt;br /&gt;
&lt;br /&gt;
Candidates for porting include the remaining MPlayer filters currently supported through the mp wrapper, libaf MPlayer filters, and filters from other frameworks (e.g. mjpegtools, transcode, avisynth, virtualdub, etc.). In the case of mp ports, the student should verify that the new filter produces the same output and is not slower.&lt;br /&gt;
&lt;br /&gt;
Some ideas for more filters:&lt;br /&gt;
* a frequency-domain filter relying on the FFT utils in libavcodec&lt;br /&gt;
* a controller filter which allows sending commands to other filters (e.g. to adjust volume, contrast, etc.), like the sendcmd filter but through an interactive GUI&lt;br /&gt;
* a Lua scripting filter, which allows implementing custom filtering logic in Lua&lt;br /&gt;
&lt;br /&gt;
For more ideas check [https://trac.ffmpeg.org/query?status=new&amp;amp;status=open&amp;amp;status=reopened&amp;amp;component=avfilter&amp;amp;col=id&amp;amp;col=summary&amp;amp;col=status&amp;amp;col=type&amp;amp;col=priority&amp;amp;col=component&amp;amp;col=version&amp;amp;order=priority trac libavfilter tickets].&lt;br /&gt;
&lt;br /&gt;
'''Expected results:''' Write or port audio and video filters and possibly fix/extend libavfilter API and design when required.&lt;br /&gt;
&lt;br /&gt;
'''Prerequisites:''' C coding skills, basic familiarity with git. Some background on DSP and image/sound processing techniques would be a bonus but is not strictly required.&lt;br /&gt;
&lt;br /&gt;
'''Qualification Task:''' write or port one or more filters&lt;br /&gt;
&lt;br /&gt;
'''Mentor:''' TBA, possibly [[User:Stefanosa|Stefano Sabatini]] (''saste'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
'''Backup mentor:''' [[User:Ubitux|Clément Bœsch]] (''ubitux'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
== Subtitles ==&lt;br /&gt;
&lt;br /&gt;
'''Description:''' FFmpeg has been working on improving its subtitles support recently, notably by adding support for various text subtitle formats and various hardsubbing (burning the subtitles onto the video) facilities. While the theme may sound relatively simple compared to audio/video signal processing, the project carries a historical burden that is not easy to deal with, and introduces various issues very specific to its sparse form.&lt;br /&gt;
&lt;br /&gt;
'''Expected results:'''&lt;br /&gt;
* Add support for new subtitles formats. Example: a demuxer for .SUP files, just like VobSub but for Blu-Ray, or a VobSub muxer.&lt;br /&gt;
* Improve text subtitles decoders. Typically, this can be supporting advanced markup features in SAMI or WebVTT.&lt;br /&gt;
* Update the API to get rid of the clumsy internal text representation of styles&lt;br /&gt;
* Proper integration of subtitles into libavfilter. This is the ultimate goal, as it will notably allow a complete subtitles rendering for applications such as ffplay.&lt;br /&gt;
* BONUS: if everything goes well, the student will be allowed to add basic support for teletext&lt;br /&gt;
&lt;br /&gt;
'''Prerequisites:''' C coding skills, basic familiarity with git. Some background in fansubbing area (notably ASS experience) would be a bonus but is not strictly required.&lt;br /&gt;
&lt;br /&gt;
'''Qualification Task:''' write a subtitle demuxer and decoder (for example, support for the Spruce subtitle format), in order to make sure the subtitles chain is understood.&lt;br /&gt;
&lt;br /&gt;
'''Mentor:''' [[User:Ubitux|Clément Bœsch]] (''ubitux'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
'''Backup Mentor:''' TBA, possibly Nicolas George (''Cigaes'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
== Postproc optimizations ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div class=&amp;quot;floatright&amp;quot;&amp;gt;[[Image:PostProc.jpg]]&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Description:''' FFmpeg contains libpostproc, which is used to postprocess 8x8 DCT-MC based video and images (JPEG, MPEG-1/2/4, and H.263 among others). Postprocessing removes blocking (and other) artifacts from low-bitrate / low-quality images and videos. The code, however, was written a long time ago, and its SIMD optimizations need to be updated to what modern CPUs support (AVX2 and SSE2+).&lt;br /&gt;
&lt;br /&gt;
'''Expected results:'''&lt;br /&gt;
* Convert all gcc inline asm in libpostproc to YASM.&lt;br /&gt;
* Restructure the code so that it works with block sizes compatible with modern SIMD.&lt;br /&gt;
* Add Integer SSE2 and AVX2 optimizations for each existing MMX/MMX2/3dnow optimization in libpostproc.&lt;br /&gt;
&lt;br /&gt;
'''Prerequisites:''' C coding skills, good x86 assembly coding skills, basic familiarity with git.&lt;br /&gt;
&lt;br /&gt;
'''Qualification Task:''' convert 1 or 2 MMX2 functions to SSE2 and AVX2.&lt;br /&gt;
&lt;br /&gt;
'''Mentor:''' [[User:Michael|Michael Niedermayer]] (''michaelni'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
'''Backup Mentor:''' TBA, possibly [[User:Stefanosa|Stefano Sabatini]] (''saste'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Bayer RGB colorspaces ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div class=&amp;quot;floatright&amp;quot;&amp;gt;[[Image:350px-Bayer_pattern_on_sensor.svg.png ]]&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Description:''' Several image and video formats store pixels using Bayer-pattern colorspaces. Supporting these formats would broaden FFmpeg's applicability to RAW still and video photography processing.&lt;br /&gt;
&lt;br /&gt;
'''Expected results:'''&lt;br /&gt;
* Rebase existing patches&lt;br /&gt;
* Implement high quality bayer transformations in libswscale (plain C)&lt;br /&gt;
* Add bayer formats to the libavutil pixfmt enumeration routines&lt;br /&gt;
* SIMD optimizations of the libswscale transformations&lt;br /&gt;
* Complete the PhotoCINE demuxer to support the Bayer format (or another format of your choosing)&lt;br /&gt;
&lt;br /&gt;
Optional goodies:&lt;br /&gt;
* Extend TIFF decoder to support DNG-Bayer format&lt;br /&gt;
* Support a popular proprietary camera format (many to choose from; see dcraw project)&lt;br /&gt;
&lt;br /&gt;
'''Prerequisites:''' C coding skills, basic familiarity with git.&lt;br /&gt;
&lt;br /&gt;
'''Qualification Task:''' Implement a simple and working Bayer-&amp;gt;RGB transform in libswscale&lt;br /&gt;
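As an illustration of the qualification task, here is a hypothetical sketch of the simplest possible Bayer-&amp;gt;RGB transform: each 2x2 RGGB cell collapses into one RGB24 pixel at half resolution. Real libswscale code would interpolate to full resolution; all names here are invented:&lt;br /&gt;

```c
#include <assert.h>
#include <stdint.h>

/* Nearest-neighbour RGGB demosaic sketch: one output pixel per 2x2
 * sensor cell (half resolution, no interpolation). w and h must be even. */
static void bayer_rggb_to_rgb24(const uint8_t *bayer, int w, int h,
                                uint8_t *rgb)
{
    int x, y;
    for (y = 0; y < h; y += 2) {
        for (x = 0; x < w; x += 2) {
            uint8_t r  = bayer[y * w + x];           /* top-left: red */
            uint8_t g0 = bayer[y * w + x + 1];       /* top-right: green */
            uint8_t g1 = bayer[(y + 1) * w + x];     /* bottom-left: green */
            uint8_t b  = bayer[(y + 1) * w + x + 1]; /* bottom-right: blue */
            uint8_t *dst = rgb + ((y / 2) * (w / 2) + x / 2) * 3;
            dst[0] = r;
            dst[1] = (uint8_t)((g0 + g1) / 2); /* average the two greens */
            dst[2] = b;
        }
    }
}
```

A "high quality" transform as listed in the expected results would instead interpolate missing colour components from neighbouring cells (e.g. bilinear or edge-directed demosaicing).&lt;br /&gt;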
&lt;br /&gt;
'''Mentor:''' TBA, possibly [[User:Suxen_drol|Peter Ross]] (''pross-au'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
'''Backup Mentor:''' [[User:Michael|Michael Niedermayer]] (''michaelni'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== MPEG-4 ALS encoder ==&lt;br /&gt;
&lt;br /&gt;
'''Description:'''&lt;br /&gt;
An MPEG-4 ALS decoder was implemented several years ago, but an encoder is still missing in the official codebase. A rudimentary encoder has already been written and is available on [https://github.com/justinruggles/FFmpeg-alsenc.git github]. For this project, that encoder is first to be updated to fit into the current codebase of FFmpeg and tested for conformance using the [http://www.nue.tu-berlin.de/menue/forschung/projekte/beendete_projekte/mpeg-4_audio_lossless_coding_als/parameter/en/#230252 reference codec and specifications]. Second, the encoder is to be brought through the usual reviewing process so that it hits the codebase at the end of the project.&lt;br /&gt;
&lt;br /&gt;
'''Expected results:'''&lt;br /&gt;
&lt;br /&gt;
* Update the existing encoder to fit into the current codebase.&lt;br /&gt;
* Ensure conformance of the encoder by verifying against the reference codec, and generate a test case for FATE.&lt;br /&gt;
* Ensure the FFmpeg decoder processes all generated files without warnings.&lt;br /&gt;
* Enhance the rudimentary feature set of the encoder.&lt;br /&gt;
&lt;br /&gt;
'''Prerequisites:''' C coding skills, basic familiarity with git. A certain interest in audio coding and/or knowledge about the FFmpeg codebase could be beneficial.&lt;br /&gt;
&lt;br /&gt;
'''Qualification Task:''' Add floating point support to MPEG-4 ALS decoder&lt;br /&gt;
&lt;br /&gt;
'''Mentor:''' [[User:Pbm|Paul B Mahol]] (''durandal_1707'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
'''Backup Mentor:''' TBA, possibly [[User:Stefanosa|Stefano Sabatini]] (''saste'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Hardware Acceleration API Software/Tracing Implementation ==&lt;br /&gt;
&lt;br /&gt;
'''Description:''' Our support for hardware accelerated decoding basically remains untested. This is in part due to FFmpeg only implementing part of the required steps, and in part because testing requires specific operating systems and hardware.&lt;br /&gt;
&lt;br /&gt;
The idea would be to start with a simple stub implementation of an API such as VDPAU that provides only the core functions. These would serialize out the function calls and the data they receive, to allow for easy comparison and thus regression testing. Possible improvements to this approach are adding basic input validation and replay capability to allow testing recorded data against real hardware. This would be similar to what [https://github.com/apitrace/apitrace apitrace] does for OpenGL.&lt;br /&gt;
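The tracing-stub idea can be sketched as follows. Names and types here are invented for illustration, not the real VDPAU API: each stub entry point appends its name and arguments to a trace buffer before returning success, so a decode run yields a serialized call log that can be diffed for regression testing:&lt;br /&gt;

```c
#include <assert.h>
#include <stdio.h>
#include <string.h>

/* Global trace buffer collecting the serialized call log. */
static char   trace_buf[1024];
static size_t trace_pos;

/* Hypothetical stub for a decoder-creation entry point: record the
 * call and its arguments, hand back a dummy handle, report success. */
static int stub_decoder_create(unsigned profile, unsigned width,
                               unsigned height, unsigned *handle)
{
    trace_pos += snprintf(trace_buf + trace_pos,
                          sizeof(trace_buf) - trace_pos,
                          "decoder_create(profile=%u, %ux%u)\n",
                          profile, width, height);
    *handle = 1; /* dummy handle; a software backend would allocate real state */
    return 0;    /* always report success */
}
```

Input validation would be added to these stubs first; a later software backend would replace the dummy handles with real decoder state.&lt;br /&gt;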
&lt;br /&gt;
A further step would be to actually add support for decoding in software, so that full testing including visual inspection is possible without the need for special hardware.&lt;br /&gt;
&lt;br /&gt;
'''Prerequisites:''' C coding skills, basic familiarity with git.&lt;br /&gt;
&lt;br /&gt;
'''Qualification Task:''' Anything related to the hardware acceleration code, though producing first ideas and code pieces for this task would also be reasonable.&lt;br /&gt;
&lt;br /&gt;
'''Mentor:''' Reimar Döffinger&lt;br /&gt;
&lt;br /&gt;
== Hardware Acceleration (hwaccel) API v2 ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div class=&amp;quot;floatright&amp;quot;&amp;gt;[[Image:Hardware.jpg]]&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Description:''' FFmpeg supports hardware accelerated decoding through the internal hwaccel API. Currently supported system hardware acceleration APIs are VA-API (Linux), DXVA2 (Windows) and VDA (Mac OS X). However, the current approach requires client applications to allocate the underlying resources (e.g. hardware surfaces and context) themselves and hand them over to FFmpeg. This incurs a few limitations: it is not scalable to new codecs, i.e. it requires new tokens for each newly supported codec; it incurs extra work in the client application, which tends to be duplicated over several client applications; and it prevents efficient fallback to software decoding mode if the hardware cannot handle a particular codec specification.&lt;br /&gt;
&lt;br /&gt;
The goal of this project is to revamp the FFmpeg Hardware Acceleration API so that hardware resources are allocated and managed in the library, thus requiring the client application to only provide a single hardware context/device handle; to provide a way to fall back early to software decoding if the underlying hardware is not able to handle the bitstream; and to make it possible to select a hardware accelerator by ID without polluting the PixelFormats namespace.&lt;br /&gt;
&lt;br /&gt;
'''Expected results:'''&lt;br /&gt;
* FFmpeg core library (libavcodec):&lt;br /&gt;
** Core API extensions and improvements&lt;br /&gt;
*** Add open/close hooks in a way that is backwards compatible with hwaccel v1 enabled applications&lt;br /&gt;
*** Add new tokens describing hardware accelerators&lt;br /&gt;
*** Add new flags exposing HW capabilities like download/upload&lt;br /&gt;
*** Investigate the benefits or impacts to provide a global map/unmap capability to FFmpeg video buffers&lt;br /&gt;
** Port hwaccels to v2 infrastructure&lt;br /&gt;
*** Port VA-API decoders to v2 infrastructure&lt;br /&gt;
*** Validate that VA-API decoders still work with existing applications supporting hwaccel v1&lt;br /&gt;
*** Provide download capability through ''vaGetImage()''&lt;br /&gt;
*** Validate that ffplay can support this feature with minor changes, and definitely no change to the existing SDL renderer&lt;br /&gt;
*** Port VDPAU decoders to hwaccel v2 (optional), and investigate ways to preserve compatibility with older applications&lt;br /&gt;
&lt;br /&gt;
* FFmpeg applications:&lt;br /&gt;
** Integrate hardware acceleration into ffplay&lt;br /&gt;
*** Create a video-output (VO) infrastructure for ffplay&lt;br /&gt;
*** Port the SDL renderer to the new VO infrastructure&lt;br /&gt;
*** Add support for VA-API: VA renderer through ''vaPutSurface()'', add -hwaccel option to select &amp;quot;vaapi&amp;quot; renderer&lt;br /&gt;
*** Add support for VDPAU (optional): VDPAU renderer through ''VdpPresentationQueueDisplay()''&lt;br /&gt;
** Integrate hardware acceleration into ffmpeg&lt;br /&gt;
*** Add support for VA-API: use the VA/DRM API for headless (no-X display) decoding, use libudev to determine the device to use&lt;br /&gt;
&lt;br /&gt;
'''Prerequisites:''' C coding skills, basic familiarity with git, hardware supporting VA-API.&lt;br /&gt;
&lt;br /&gt;
'''Qualification Task:''' Anything related to the Hardware Acceleration (hwaccel) API, or to its related users. e.g. add JPEG decoding support with VA-API, etc.&lt;br /&gt;
&lt;br /&gt;
'''Mentor:''' TBA, possibly [[User:Gwenole_Beauchesne|Gwenole Beauchesne]] (''__gb__'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
'''Backup Mentor:''' TBA&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Hardware Accelerated Video Encoding with VA-API ==&lt;br /&gt;
&lt;br /&gt;
'''Description:''' FFmpeg already supports hardware accelerated decoding for multiple codecs but still lacks support for hardware accelerated encoding. The aim of the project is to add support for encoding with VA-API specifically, while keeping a generic enough approach in mind so that other hardware accelerators (TI-DSP, CUDA?) could be supported as well. This means that new ''hwaccel'' hooks are needed, and two operational modes are possible: either ''(i)'' the driver or hardware packs headers itself, or ''(ii)'' latitude is left to perform this task at the FFmpeg library level.&lt;br /&gt;
&lt;br /&gt;
'''Expected results:''' Allow MPEG-2 and H.264 encoding with VA-API, while supporting variable bitrate (VBR) by default, and allowing alternate methods like constant bitrate (CBR) or constant QP (CQP) where appropriate or requested.&lt;br /&gt;
* MPEG-2 encoding:&lt;br /&gt;
** Add basic encoding with I/P frames (handle the ''-g'' option)&lt;br /&gt;
** Add support for B frames (handle the ''-bf'' option)&lt;br /&gt;
** Add support for constant bitrate (CBR, i.e. ''maxrate == bitrate'' and ''bufsize'' set)&lt;br /&gt;
** (Optionally) add support for interlaced contents&lt;br /&gt;
* H.264 encoding:&lt;br /&gt;
** Add basic encoding with I/P frames (handle the ''-g'' option)&lt;br /&gt;
** Add support for B frames (handle the ''-bf'' option)&lt;br /&gt;
** Add support for constant bitrate (CBR, i.e. ''maxrate == bitrate'' and ''bufsize'' set)&lt;br /&gt;
** Add support for constant QP (CQP, i.e. handle the ''-cqp'' option)&lt;br /&gt;
** Add support for more than one reference frame, while providing/using API to query the hardware capabilities&lt;br /&gt;
** Work on HRD conformance. May require to write an independent tool to assess that&lt;br /&gt;
** (Optionally) add configurability of the motion estimation method to use. Define new types for HW accelerated encoding with at least two levels/hints for the accelerator.&lt;br /&gt;
* FFmpeg applications:&lt;br /&gt;
** Define common hwaccel interface for encoding&lt;br /&gt;
** Add initial support for hardware accelerated encoding to the ''ffmpeg'' application&lt;br /&gt;
&lt;br /&gt;
'''Prerequisites:''' C coding skills, basic familiarity with git, hardware supporting VA-API for encoding.&lt;br /&gt;
&lt;br /&gt;
'''Qualification Task:''' Anything related to the Hardware Acceleration (hwaccel) API, or to its related users. e.g. add JPEG decoding support with VA-API, etc.&lt;br /&gt;
&lt;br /&gt;
'''Mentor:''' TBA, possibly [[User:Gwenole_Beauchesne|Gwenole Beauchesne]] (''__gb__'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
'''Backup Mentor:''' TBA, possibly Tushar Gohad&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot;/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== AAC Improvements ==&lt;br /&gt;
&lt;br /&gt;
'''Description:''' FFmpeg contains an AAC encoder and decoder; both of them can be improved in various ways. This is enough work for more than one GSoC project, so one part of your submission would be to define which task exactly you want to work on.&lt;br /&gt;
* AAC BSAC decoder: This has already been started, but the existing decoder still fails on many samples&lt;br /&gt;
* AAC SSR decoder&lt;br /&gt;
* AAC 960/120 MDCT window&lt;br /&gt;
&lt;br /&gt;
'''Qualification Task:''' See the FFmpeg bug tracker for AAC issues; rebasing the existing incomplete BSAC decoder onto current git head or fixing one or more existing bugs are possible qualification tasks.&lt;br /&gt;
&lt;br /&gt;
'''Prerequisites:''' C coding skills, basic familiarity with git, knowledge about transform based audio coding would be useful.&lt;br /&gt;
&lt;br /&gt;
'''Mentor:''' Baptiste Coudurier (''bcoudurier'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
'''Backup Mentor:''' TBA, possibly [[User:Stefanosa|Stefano Sabatini]] (''saste'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot;/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== DTS / DCA Improvements ==&lt;br /&gt;
&lt;br /&gt;
'''Description:''' FFmpeg contains a DTS decoder.&lt;br /&gt;
* DTS-HD decoder improvements: A possible qualification task is to implement ticket [https://trac.ffmpeg.org/ticket/1920 #1920]&lt;br /&gt;
** Add support for X96 extension (96 kHz)&lt;br /&gt;
** Add support for XLL extension (lossless)&lt;br /&gt;
** Add support for pure DTS-HD streams that do not contain a DTS core&lt;br /&gt;
** Add support for multiple assets&lt;br /&gt;
** Add support for LBR extension&lt;br /&gt;
&lt;br /&gt;
'''Prerequisites:''' C coding skills, basic familiarity with git.&lt;br /&gt;
&lt;br /&gt;
'''Mentor:''' Benjamin Larsson (''merbanan'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
'''Backup Mentor:''' TBA, possibly [[User:Stefanosa|Stefano Sabatini]] (''saste'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
== MXF Demuxer Improvements ==&lt;br /&gt;
&lt;br /&gt;
'''Description:''' FFmpeg's MXF demuxer needs a proper, compact way to map EssenceContainer ULs to WrappingKind. See ticket #2776; the notes in ticket #1916 are also relevant.&lt;br /&gt;
&lt;br /&gt;
The gist of this is that essence in MXF is typically stored in one of two ways: as an audio/video interleave or with each stream in one huge chunk (like 1 GiB audio followed by 10 GiB video). Previous ways of telling these apart have been technically wrong, but have worked due to a lack of samples demonstrating the contrary.&lt;br /&gt;
&lt;br /&gt;
'''Expected results:''' The sample in ticket #2776 demuxes fine and there's a test case in FATE for it. The solution should grow libavformat by no more than 32 KiB.&lt;br /&gt;
&lt;br /&gt;
'''Prerequisites:''' C coding skills, basic familiarity with git.&lt;br /&gt;
&lt;br /&gt;
'''Qualification Task:''' Investigate whether there is a compact way of representing the UL -&amp;gt; WrappingKind mapping specified in the official RP224 Excel document. The tables take up about half a megabyte verbatim, which is unacceptable in a library as large as libavformat.&lt;br /&gt;
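As a purely illustrative sketch of one possible compact representation (not FFmpeg code, and using made-up placeholder values rather than real RP224 entries): all SMPTE ULs begin with the fixed 4-byte prefix 06 0e 2b 34, so only the remaining 12 bytes need to be stored per entry.&lt;br /&gt;

```python
# Hypothetical sketch of a compact UL -> WrappingKind mapping.
# All SMPTE ULs start with the fixed prefix 06 0e 2b 34, so only
# the 12-byte suffix is stored. The suffixes below are made-up
# placeholders, NOT values from the real RP224 tables.
SMPTE_PREFIX = bytes.fromhex("060e2b34")

SUFFIX_TO_WRAPPING = {
    bytes.fromhex("0401010d0301020101010100"): "frame",
    bytes.fromhex("0401010d0301020102020100"): "clip",
}

def wrapping_kind(ul):
    """Return the wrapping kind for a 16-byte UL, or None if unknown."""
    if len(ul) != 16 or not ul.startswith(SMPTE_PREFIX):
        return None
    return SUFFIX_TO_WRAPPING.get(ul[4:])
```

A real implementation would still have to encode the full RP224 table compactly, for example by factoring out further shared prefixes, to stay well within a small size budget.&lt;br /&gt;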
&lt;br /&gt;
'''Mentor:''' TBA, possibly Tomas Härdin (''thardin'' in #ffmpeg-devel on Freenode IRC)&lt;br /&gt;
&lt;br /&gt;
'''Backup Mentor:''' TBA&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot;/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Unmentored Projects =&lt;br /&gt;
&lt;br /&gt;
This is a list of projects that students are encouraged to consider if a mentored project is unavailable or not within the student's skills or interests. The student will have to find a mentor for the project. A student can also [[#Your_Own_Idea|propose their own project]].&lt;br /&gt;
&lt;br /&gt;
== glplay ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div class=&amp;quot;floatleft&amp;quot;&amp;gt;[[Image:Opengl_logo.jpg]]&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Description:''' The SDL library used by FFplay has some deficiencies; adding OpenGL output to FFplay should allow for better performance (and fewer bugs, at least for some hardware/driver combinations). This could be a new application (glplay), but it is probably simpler to extend ffplay to use OpenGL. You can use code from MPlayer's OpenGL vo module, which may be relicensed under the LGPL.&lt;br /&gt;
&lt;br /&gt;
'''Mentor:''' TBD Backup: Reimar Döffinger&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== TrueHD encoder ==&lt;br /&gt;
&lt;br /&gt;
'''Description:''' FFmpeg currently does not support encoding to one of the lossless audio formats used on Blu-ray discs. This task consists of implementing a TrueHD encoder that allows losslessly encoding audio for playback on hardware devices capable of TrueHD decoding.&lt;br /&gt;
&lt;br /&gt;
== Opus decoder ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div class=&amp;quot;floatright&amp;quot;&amp;gt;[[Image:Opus.png]]&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Description:''' Opus decoding is currently supported through the external libopus library.&lt;br /&gt;
* Write a native decoder, continuing work on the existing unfinished implementation&lt;br /&gt;
A possible qualification task is to port the existing incomplete decoder to current git head and improve it to show that you are capable of working on this task.&lt;br /&gt;
&lt;br /&gt;
== VC-1 interlaced ==&lt;br /&gt;
&lt;br /&gt;
'''Description:''' The FFmpeg VC-1 decoder has improved over the years, but many samples are still not decoded bit-exact and real-world interlaced streams typically show artefacts.&lt;br /&gt;
* Implement missing interlace features&lt;br /&gt;
* Make more reference samples bit-exact&lt;br /&gt;
As a qualification task, you should try to find a bug in the current decoder implementation and fix it.&lt;br /&gt;
&lt;br /&gt;
== JPEG 2000 ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;div class=&amp;quot;floatleft&amp;quot;&amp;gt;[[Image:Jpeg2000.jpg]]&amp;lt;/div&amp;gt;&lt;br /&gt;
&lt;br /&gt;
'''Description:''' FFmpeg contains an experimental native JPEG 2000 encoder and decoder. Both are missing many features, see also the FFmpeg bug tracker for some unsupported samples.&lt;br /&gt;
Work on an issue (for example from the bug tracker) as a qualification task to show that you are capable of improving the codec implementation.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot;&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== VP8L ==&lt;br /&gt;
&lt;br /&gt;
'''Description:''' [[VP8L]] is a lossless format used in WebP. There is no support for this in FFmpeg.&lt;br /&gt;
&lt;br /&gt;
== Your Own Project Idea ==&lt;br /&gt;
&lt;br /&gt;
A student can propose a project. Ideas can also be found by browsing bugs and feature requests on our [https://trac.ffmpeg.org/ bug tracker]. The work should last the majority of the GSoC duration, the task must be approved by the developers, and a mentor must be assigned.&lt;br /&gt;
&lt;br /&gt;
Students can discuss an idea in the [http://ffmpeg.org/mailman/listinfo/ffmpeg-devel ffmpeg-devel mailing-list], the #ffmpeg-devel IRC channel, or contact the FFmpeg GSoC admins for more information.&lt;br /&gt;
&lt;br /&gt;
[[Category:FFmpeg]]&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=FFmtech_TODO&amp;diff=14622</id>
		<title>FFmtech TODO</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=FFmtech_TODO&amp;diff=14622"/>
		<updated>2013-09-15T22:37:13Z</updated>

		<summary type="html">&lt;p&gt;Reimar: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;A list of (smallish?) tasks to do for the foundation, to pick whenever someone has time.&lt;br /&gt;
&lt;br /&gt;
* review web page&lt;br /&gt;
* upload the bylaws properly to the web page&lt;br /&gt;
* add slightly more detailed descriptions of the board members&lt;br /&gt;
* link to this page from some appropriate places?&lt;br /&gt;
* plan next board meeting, discuss topics on list&lt;br /&gt;
* add more tasks to this list&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=FFmtech_TODO&amp;diff=14621</id>
		<title>FFmtech TODO</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=FFmtech_TODO&amp;diff=14621"/>
		<updated>2013-09-15T22:33:17Z</updated>

		<summary type="html">&lt;p&gt;Reimar: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;A list of (smallish?) tasks to do for the foundation, to pick whenever someone has time.&lt;br /&gt;
&lt;br /&gt;
* review web page&lt;br /&gt;
* upload the bylaws properly to the web page&lt;br /&gt;
* add slightly more detailed descriptions of the board members&lt;br /&gt;
* plan next board meeting, discuss topics on list&lt;br /&gt;
* add more tasks to this list&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=FFmtech_TODO&amp;diff=14620</id>
		<title>FFmtech TODO</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=FFmtech_TODO&amp;diff=14620"/>
		<updated>2013-09-15T22:32:43Z</updated>

		<summary type="html">&lt;p&gt;Reimar: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;A list of (smallish?) tasks to do for the foundation, to pick whenever someone has time.&lt;br /&gt;
&lt;br /&gt;
* review web page&lt;br /&gt;
* upload the bylaws properly to the web page&lt;br /&gt;
* add slightly more detailed descriptions of the board members&lt;br /&gt;
* add more tasks to this list&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=USM&amp;diff=14619</id>
		<title>USM</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=USM&amp;diff=14619"/>
		<updated>2013-09-03T19:18:55Z</updated>

		<summary type="html">&lt;p&gt;Reimar: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;USM Container format&lt;br /&gt;
&lt;br /&gt;
This format is used by e.g. &amp;quot;The Witcher 2&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
It contains ordinary MPEG (probably MPEG-2) video, with the encoder string &amp;quot;TMPGEXE&amp;quot; being very obvious.&lt;br /&gt;
&lt;br /&gt;
The file consists of chunks (nested, in the case of the header).&lt;br /&gt;
&lt;br /&gt;
Each main chunk is at least 64-byte aligned (though the design of the format should allow anything from 0 to 64 kB alignment if desired).&lt;br /&gt;
&lt;br /&gt;
The format of a chunk is&lt;br /&gt;
* 4 byte identifier string&lt;br /&gt;
* 4 byte length of data not including 8 bytes for identifier and length, big-endian&lt;br /&gt;
* 2 byte unknown (always 0 for @UTF, 0x18 otherwise?)&lt;br /&gt;
* 2 byte padding bytes&lt;br /&gt;
* 4 byte probably type 1 = header, 2 = metadata, 3 = index, 0 = data&lt;br /&gt;
* 4 byte timestamp&lt;br /&gt;
* 4 byte unknown&lt;br /&gt;
* 8 byte unknown (always 0, padding?)&lt;br /&gt;
&lt;br /&gt;
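The header layout above can be sketched as a small parser (an illustrative reading of this page's notes, not verified against real files; field names are invented for the sketch):&lt;br /&gt;

```python
import struct

# 32-byte USM chunk header, big-endian, per the field list above:
# 4s identifier, I payload length, H unknown, 2x padding,
# I type, I timestamp, I unknown, Q unknown/padding
CHUNK_HEADER = struct.Struct("4sIH2xIIIQ".join([">", ""]))  # ">4sIH2xIIIQ"

def parse_chunk_header(buf):
    """Decode one 32-byte USM chunk header into a dict.
    The length field excludes the 8 bytes of identifier + length."""
    ident, length, _unk16, ctype, timestamp, _unk32, _unk64 = CHUNK_HEADER.unpack(buf[:32])
    type_names = {0: "data", 1: "header", 2: "metadata", 3: "index"}
    return {
        "id": ident.decode("ascii", "replace"),
        "payload_len": length,
        "type": type_names.get(ctype, ctype),
        "timestamp": timestamp,
    }
```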
The known identifier strings are:&lt;br /&gt;
* CRID Main header, usually 512 bytes long&lt;br /&gt;
* @UTF Used only as a sub-chunk, only 32-byte aligned. Header/metadata, consisting first of the (binary) real data, followed by a textual description of the fields.&lt;br /&gt;
* @SFV Video stream. If type is &amp;quot;data&amp;quot;, the actual data seems to start only at offset 0x60.&lt;br /&gt;
* @SFA Audio stream. If type is &amp;quot;data&amp;quot;, the actual data seems to start at offset 0x20, just like for metadata.&lt;br /&gt;
&lt;br /&gt;
An index (called SEEKINFO) seems to exist only for video (it is placed into a @SFV tag), probably assuming the audio to be correctly interleaved.&lt;br /&gt;
&lt;br /&gt;
[[Category:Container Formats]]&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=USM&amp;diff=14618</id>
		<title>USM</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=USM&amp;diff=14618"/>
		<updated>2013-09-01T16:35:17Z</updated>

		<summary type="html">&lt;p&gt;Reimar: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;USM Container format&lt;br /&gt;
&lt;br /&gt;
This format is used by e.g. &amp;quot;The Witcher 2&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
It contains ordinary MPEG (probably MPEG-2) video, with the encoder string &amp;quot;TMPGEXE&amp;quot; being very obvious.&lt;br /&gt;
&lt;br /&gt;
The file consists of chunks (nested, in the case of the header).&lt;br /&gt;
&lt;br /&gt;
Each main chunk is at least 64-byte aligned (though the design of the format should allow anything from 0 to 64 kB alignment if desired).&lt;br /&gt;
&lt;br /&gt;
The format of a chunk is&lt;br /&gt;
* 4 byte identifier string&lt;br /&gt;
* 4 byte length of data not including 8 bytes for identifier and length, big-endian&lt;br /&gt;
* 2 byte unknown&lt;br /&gt;
* 2 byte padding bytes&lt;br /&gt;
* 4 byte probably type 1 = header, 2 = metadata, 3 = index, 0 = data&lt;br /&gt;
* 4 byte timestamp&lt;br /&gt;
* 4 byte unknown&lt;br /&gt;
* 8 byte unknown&lt;br /&gt;
&lt;br /&gt;
The known identifier strings are:&lt;br /&gt;
* CRID Main header, usually 512 bytes long&lt;br /&gt;
* @UTF Used only as a sub-chunk, only 32 byte aligned. Header/meta data, consisting of first the (binary) real data, followed by a textual description of the fields.&lt;br /&gt;
* @SFV Video stream. If type is &amp;quot;data&amp;quot;, the actual data seems to start only at offset 0x60.&lt;br /&gt;
* @SFA Audio stream. If type is &amp;quot;data&amp;quot;, the actual data just like for metadata seems to start at offset 0x20.&lt;br /&gt;
&lt;br /&gt;
An index (called SEEKINFO) seems to exist only for video (it is placed into a @SFV tag), probably assuming the audio to be correctly interleaved.&lt;br /&gt;
&lt;br /&gt;
[[Category:Container Formats]]&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=USM&amp;diff=14617</id>
		<title>USM</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=USM&amp;diff=14617"/>
		<updated>2013-09-01T16:33:17Z</updated>

		<summary type="html">&lt;p&gt;Reimar: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;USM Container format&lt;br /&gt;
&lt;br /&gt;
This format is used by e.g. &amp;quot;The Witcher 2&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
It contains ordinary MPEG (probably MPEG-2) video, with the encoder string &amp;quot;TMPGEXE&amp;quot; being very obvious.&lt;br /&gt;
&lt;br /&gt;
The file consists of chunks (nested, in the case of the header).&lt;br /&gt;
&lt;br /&gt;
Each main chunk is at least 64-byte aligned (though the design of the format should allow anything from 0 to 64 kB alignment if desired).&lt;br /&gt;
&lt;br /&gt;
The format of a chunk is&lt;br /&gt;
* 4 byte identifier string&lt;br /&gt;
* 4 byte length, big-endian&lt;br /&gt;
* 2 byte unknown&lt;br /&gt;
* 2 byte padding bytes&lt;br /&gt;
* 4 byte probably type 1 = header, 2 = metadata, 3 = index, 0 = data&lt;br /&gt;
* 4 byte timestamp&lt;br /&gt;
* 4 byte unknown&lt;br /&gt;
* 8 byte unknown&lt;br /&gt;
&lt;br /&gt;
The known identifier strings are:&lt;br /&gt;
* CRID main header&lt;br /&gt;
* @UTF Used only as a sub-chunk, only 32 byte aligned. Header/meta data, consisting of first the (binary) real data, followed by a textual description of the fields.&lt;br /&gt;
* @SFV Video stream. If type is &amp;quot;data&amp;quot;, the actual data seems to start only at offset 0x60.&lt;br /&gt;
* @SFA Audio stream. If type is &amp;quot;data&amp;quot;, the actual data just like for metadata seems to start at offset 0x20.&lt;br /&gt;
&lt;br /&gt;
An index (called SEEKINFO) seems to exist only for video (it is placed into a @SFV tag), probably assuming the audio to be correctly interleaved.&lt;br /&gt;
&lt;br /&gt;
[[Category:Container Formats]]&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=USM&amp;diff=14616</id>
		<title>USM</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=USM&amp;diff=14616"/>
		<updated>2013-09-01T16:32:08Z</updated>

		<summary type="html">&lt;p&gt;Reimar: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;USM Container format&lt;br /&gt;
&lt;br /&gt;
This format is used by e.g. &amp;quot;The Witcher 2&amp;quot;.&lt;br /&gt;
It contains ordinary MPEG (probably MPEG-2) video, with the encoder string &amp;quot;TMPGEXE&amp;quot; being very obvious.&lt;br /&gt;
The file consists of chunks (nested, in the case of the header).&lt;br /&gt;
Each main chunk is at least 64-byte aligned (though the design of the format should allow anything from 0 to 64 kB alignment if desired).&lt;br /&gt;
The format of a chunk is&lt;br /&gt;
4 byte identifier string&lt;br /&gt;
4 byte length, big-endian&lt;br /&gt;
2 byte unknown&lt;br /&gt;
2 byte padding bytes&lt;br /&gt;
4 byte probably type 1 = header, 2 = metadata, 3 = index, 0 = data&lt;br /&gt;
4 byte timestamp&lt;br /&gt;
4 byte unknown&lt;br /&gt;
8 byte unknown&lt;br /&gt;
&lt;br /&gt;
The known identifier strings are:&lt;br /&gt;
CRID main header&lt;br /&gt;
@UTF Used only as a sub-chunk, only 32 byte aligned. Header/meta data, consisting of first the (binary) real data, followed by a textual description of the fields.&lt;br /&gt;
@SFV Video stream. If type is &amp;quot;data&amp;quot;, the actual data seems to start only at offset 0x60.&lt;br /&gt;
@SFA Audio stream. If type is &amp;quot;data&amp;quot;, the actual data just like for metadata seems to start at offset 0x20.&lt;br /&gt;
&lt;br /&gt;
An index (called SEEKINFO) seems to exist only for video (it is placed into a @SFV tag), probably assuming the audio to be correctly interleaved.&lt;br /&gt;
&lt;br /&gt;
[[Category:Container Formats]]&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=USM&amp;diff=14615</id>
		<title>USM</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=USM&amp;diff=14615"/>
		<updated>2013-09-01T16:29:57Z</updated>

		<summary type="html">&lt;p&gt;Reimar: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;USM Container format&lt;br /&gt;
&lt;br /&gt;
This format is used by e.g. &amp;quot;The Witcher 2&amp;quot;.&lt;br /&gt;
It contains ordinary MPEG (probably MPEG-2) video, with the encoder string &amp;quot;TMPGEXE&amp;quot; being very obvious.&lt;br /&gt;
The file consists of chunks (nested, in the case of the header).&lt;br /&gt;
Each main chunk is at least 64-byte aligned (though the design of the format should allow anything from 0 to 64 kB alignment if desired).&lt;br /&gt;
The format of a chunk is&lt;br /&gt;
4 byte identifier string&lt;br /&gt;
4 byte length, big-endian&lt;br /&gt;
2 byte unknown&lt;br /&gt;
2 byte padding bytes&lt;br /&gt;
4 byte probably type 1 = header, 2 = metadata, 3 = index, 0 = data&lt;br /&gt;
4 byte timestamp&lt;br /&gt;
4 byte unknown&lt;br /&gt;
8 byte unknown&lt;br /&gt;
&lt;br /&gt;
The known identifier strings are:&lt;br /&gt;
CRID main header&lt;br /&gt;
@UTF Used only as a sub-chunk, only 32 byte aligned. Header/meta data, consisting of first the (binary) real data, followed by a textual description of the fields.&lt;br /&gt;
@SFV Video stream. If type is &amp;quot;data&amp;quot;, the actual data seems to start only at offset 0x60.&lt;br /&gt;
@SFA Audio stream. If type is &amp;quot;data&amp;quot;, the actual data just like for metadata seems to start at offset 0x20.&lt;br /&gt;
&lt;br /&gt;
An index (called SEEKINFO) seems to exist only for video (it is placed into a @SFV tag), probably assuming the audio to be correctly interleaved.&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=FFmpeg_Summer_of_Code_2013&amp;diff=14387</id>
		<title>FFmpeg Summer of Code 2013</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=FFmpeg_Summer_of_Code_2013&amp;diff=14387"/>
		<updated>2013-03-06T19:57:14Z</updated>

		<summary type="html">&lt;p&gt;Reimar: /* glplay */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;FFmpeg Summer of Code Ideas List&lt;br /&gt;
&lt;br /&gt;
= Proposing Ideas =&lt;br /&gt;
&lt;br /&gt;
If you have a project idea please contact a developer first at the #ffmpeg-devel IRC channel on Freenode or via the [http://ffmpeg.org/contact.html ffmpeg-devel mailing list]. A good source of ideas is the [https://ffmpeg.org/trac/ffmpeg/ FFmpeg bug tracker] and [[FFmpeg_Summer_of_Code_2012|FFmpeg Summer of Code 2012 Ideas List]].&lt;br /&gt;
&lt;br /&gt;
When adding an idea follow this template for consistency:&lt;br /&gt;
&lt;br /&gt;
== Example Title ==&lt;br /&gt;
&lt;br /&gt;
'''Description:''' A few sentences or a short paragraph describing the task.&lt;br /&gt;
&lt;br /&gt;
'''Expected results:''' Bulleted list or paragraph describing what the student is expected to achieve.&lt;br /&gt;
&lt;br /&gt;
'''Prerequisites:''' Skills or knowledge required by student.&lt;br /&gt;
&lt;br /&gt;
'''Mentor:''' List mentor and backup mentor if there is one and contact info such as IRC name or email address.&lt;br /&gt;
&lt;br /&gt;
= GSoC task proposal ideas =&lt;br /&gt;
&lt;br /&gt;
== Audio codecs ==&lt;br /&gt;
&lt;br /&gt;
=== AAC ===&lt;br /&gt;
&lt;br /&gt;
'''Description:''' FFmpeg contains an AAC encoder and decoder; both can be improved in various ways. This is enough work for more than one GSoC project, so part of your submission would be to define exactly which task you want to work on.&lt;br /&gt;
* AAC LD decoder&lt;br /&gt;
* AAC BSAC decoder: This has already been started, but the existing decoder still fails on many samples&lt;br /&gt;
* AAC SSR decoder&lt;br /&gt;
* AAC 960/120 MDCT window&lt;br /&gt;
* AAC multi-channel encoding&lt;br /&gt;
See also the FFmpeg bug tracker for AAC issues; fixing one or more of them, or rebasing the existing incomplete BSAC decoder onto current git head, are possible qualification tasks.&lt;br /&gt;
&lt;br /&gt;
=== DTS / DCA ===&lt;br /&gt;
&lt;br /&gt;
'''Description:''' FFmpeg contains a DTS decoder and an experimental DTS encoder. Both are missing some features.&lt;br /&gt;
* DTS-HD decoder improvements: A possible qualification task is to implement ticket #1920&lt;br /&gt;
** Add support for X96 extension (96 kHz)&lt;br /&gt;
** Add support for XLL extension (lossless)&lt;br /&gt;
** Add support for pure DTS-HD streams that do not contain a DTS core&lt;br /&gt;
** Add support for multiple assets&lt;br /&gt;
** Add support for LBR extension&lt;br /&gt;
* DTS encoder improvements&lt;br /&gt;
&lt;br /&gt;
=== MPEG-4 ALS encoder ===&lt;br /&gt;
&lt;br /&gt;
'''Description:''' An ALS decoder was implemented several years ago; an encoder is still missing.&lt;br /&gt;
&lt;br /&gt;
=== TrueHD encoder ===&lt;br /&gt;
&lt;br /&gt;
'''Description:''' FFmpeg currently does not support encoding to one of the lossless audio formats used on Blu-ray discs. This task consists of implementing a TrueHD encoder that allows losslessly encoding audio for playback on hardware devices capable of TrueHD decoding.&lt;br /&gt;
&lt;br /&gt;
=== Opus decoder ===&lt;br /&gt;
&lt;br /&gt;
'''Description:''' Opus decoding is currently supported through the external libopus library.&lt;br /&gt;
* Write a native decoder, continuing work on the existing unfinished implementation&lt;br /&gt;
A possible qualification task is to port the existing incomplete decoder to current git head and improve it to show that you are capable of working on this task.&lt;br /&gt;
&lt;br /&gt;
== Video codecs ==&lt;br /&gt;
&lt;br /&gt;
=== HEVC / H265 ===&lt;br /&gt;
&lt;br /&gt;
'''Description:''' The specification has been finalized; FFmpeg needs a decoder for this new format.&lt;br /&gt;
* Write a basic decoder supporting I, P, and, only if time permits, B slices.&lt;br /&gt;
* It does not need to be ASM/SIMD optimized but its high level structure must permit such optimizations to be easily added later.&lt;br /&gt;
As a qualification task you need to implement header parsing, and maybe a bit beyond that, to demonstrate that you are qualified and understand the HEVC specification. This project requires a solid understanding of video coding and C; it is not something for the average SoC student.&lt;br /&gt;
&lt;br /&gt;
=== H264 MVC ===&lt;br /&gt;
&lt;br /&gt;
'''Description:''' MVC samples exist and the codec is used on Blu-ray media, but FFmpeg is missing a decoder. Since this task also involves some changes to the current architecture, it is especially important that it is discussed on the ffmpeg-devel mailing list.&lt;br /&gt;
As a qualification task, do some work that demonstrates your understanding of MVC and forms a subpart of the whole MVC implementation.&lt;br /&gt;
&lt;br /&gt;
=== VC-1 interlaced ===&lt;br /&gt;
&lt;br /&gt;
'''Description:''' The FFmpeg VC-1 decoder has improved over the years, but many samples are still not decoded bit-exact and real-world interlaced streams typically show artefacts.&lt;br /&gt;
* Implement missing interlace features&lt;br /&gt;
* Make more reference samples bit-exact&lt;br /&gt;
As a qualification task, you should try to find a bug in the current decoder implementation and fix it.&lt;br /&gt;
&lt;br /&gt;
=== Animated Portable Network Graphics (APNG) ===&lt;br /&gt;
&lt;br /&gt;
'''Description:''' FFmpeg currently does not support Animated PNGs.&lt;br /&gt;
&lt;br /&gt;
'''Specification:''' https://wiki.mozilla.org/APNG_Specification&lt;br /&gt;
&lt;br /&gt;
'''Expected results:'''&lt;br /&gt;
* APNG demuxer&lt;br /&gt;
** implement robust probing:&lt;br /&gt;
*** PNG images are not misdetected as APNG animations&lt;br /&gt;
*** APNG animations are not misdetected as PNG images&lt;br /&gt;
** split the stream into sensible packets (so they can be easily reused in the APNG muxer)&lt;br /&gt;
** survive fuzzing (zzuf)&lt;br /&gt;
** add FATE coverage, coverage should be at least 70%&lt;br /&gt;
** test code under valgrind so no invalid reads/writes happen&lt;br /&gt;
&lt;br /&gt;
* APNG decoder&lt;br /&gt;
** use existing PNG decoder code (write decoder in same file)&lt;br /&gt;
** implement parsing of all APNG chunks (acTL, fcTL, fdAT)&lt;br /&gt;
** error handling&lt;br /&gt;
** survive fuzzing (zzuf)&lt;br /&gt;
** add test for FATE, coverage should be at least 75%&lt;br /&gt;
** CRC checksum validation&lt;br /&gt;
** test code under valgrind so no invalid reads/writes happen&lt;br /&gt;
&lt;br /&gt;
* APNG muxer &amp;amp;&amp;amp; APNG encoder&lt;br /&gt;
** use existing PNG encoder code (write encoder in same file)&lt;br /&gt;
** write compliant files, make sure they play correctly in major web browsers that support APNG&lt;br /&gt;
** add test for FATE&lt;br /&gt;
&lt;br /&gt;
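To illustrate the probing requirement: an APNG file is distinguished from a plain PNG by an acTL chunk appearing before the first IDAT chunk. A minimal sketch of that check (not the actual demuxer code, and without the CRC validation or fuzzing robustness the task requires):&lt;br /&gt;

```python
import struct

PNG_SIGNATURE = b"\x89PNG\r\n\x1a\n"

def looks_like_apng(data):
    """Return True if an acTL chunk precedes the first IDAT chunk,
    which is what distinguishes an APNG animation from a plain PNG."""
    if not data.startswith(PNG_SIGNATURE):
        return False
    pos = len(PNG_SIGNATURE)
    while len(data) - pos >= 8:
        length, ctype = struct.unpack_from(">I4s", data, pos)
        if ctype == b"acTL":
            return True
        if ctype in (b"IDAT", b"IEND"):
            return False
        pos += 12 + length  # 4 length + 4 type + payload + 4 CRC
    return False
```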
'''Prerequisites:''' C coding skills, familiarity with git/source code control systems.&lt;br /&gt;
&lt;br /&gt;
'''Qualification Task:''' Implement format autodetection for imagepipe &amp;amp; image demuxer&lt;br /&gt;
&lt;br /&gt;
'''Mentor: [[User:Pbm|Paul B Mahol]]'''&lt;br /&gt;
&lt;br /&gt;
=== JPEG 2000 ===&lt;br /&gt;
&lt;br /&gt;
'''Description:''' FFmpeg contains an experimental native JPEG 2000 encoder and decoder. Both are missing many features, see also the FFmpeg bug tracker for some unsupported samples.&lt;br /&gt;
Work on an issue (for example from the bug tracker) as a qualification task to show that you are capable of improving the codec implementation.&lt;br /&gt;
&lt;br /&gt;
=== VP7 ===&lt;br /&gt;
&lt;br /&gt;
'''Description:''' Not many VP7 samples are in the wild, and although a specification exists, no open-source decoder does. Write a decoder that reuses as much existing FFmpeg code as possible; functions from the existing decoders for other On2 formats are likely to be useful.&lt;br /&gt;
&lt;br /&gt;
== Other ==&lt;br /&gt;
&lt;br /&gt;
=== glplay ===&lt;br /&gt;
&lt;br /&gt;
'''Description:''' The SDL library that is used by FFplay has some deficiencies, adding OpenGL output to FFplay should allow for better performance (and less bugs at least for some hardware / driver combinations). This could be a new application (glplay), but it is probably simpler to extend ffplay to use OpenGL. You can use code from MPlayer's OpenGL vo module which may be relicensed under the LGPL.&lt;br /&gt;
&lt;br /&gt;
'''Mentor:''' TBD Backup: Reimar Döffinger&lt;br /&gt;
&lt;br /&gt;
=== Misc Libavfilter extension ===&lt;br /&gt;
&lt;br /&gt;
'''Description:''' Libavfilter is the FFmpeg filtering library. It currently supports audio and video filtering and generation. This work may focus on porting, fixing, extending, or writing new audio and video filters from scratch.&lt;br /&gt;
&lt;br /&gt;
Candidate filters for porting may be the remaining MPlayer filters currently supported through the mp wrapper, libaf MPlayer filters, and filters from other frameworks (e.g. mjpegtools, transcode, avisynth, virtualdub, etc.). In case of mp ports, the student should verify that the new filter produces the same output and is not slower.&lt;br /&gt;
&lt;br /&gt;
Some ideas for more filters:&lt;br /&gt;
* a frequency-domain filter relying on the FFT utilities in libavcodec&lt;br /&gt;
* a controller filter which allows sending commands to other filters (e.g. to adjust volume, contrast, etc.), e.g. like the sendcmd filter but through an interactive GUI&lt;br /&gt;
* a Lua scripting filter, which allows implementing custom filtering logic in Lua&lt;br /&gt;
&lt;br /&gt;
For more ideas check:&lt;br /&gt;
[https://ffmpeg.org/trac/ffmpeg/query?status=new&amp;amp;status=open&amp;amp;status=reopened&amp;amp;component=avfilter&amp;amp;col=id&amp;amp;col=summary&amp;amp;col=status&amp;amp;col=type&amp;amp;col=priority&amp;amp;col=component&amp;amp;col=version&amp;amp;order=priority trac libavfilter tickets].&lt;br /&gt;
&lt;br /&gt;
'''Expected results:''' Write or port audio and video filters and possibly fix/extend libavfilter API and design when required.&lt;br /&gt;
&lt;br /&gt;
'''Prerequisites:''' C coding skills, familiarity with git/source code control systems. Some background on DSP and image/sound processing techniques would be a bonus but is not strictly required.&lt;br /&gt;
&lt;br /&gt;
'''Qualification task:''' write or port one or more filters&lt;br /&gt;
&lt;br /&gt;
'''Mentor:''' Stefano Sabatini (''saste'' on IRC). Backup: Clément Bœsch (''ubitux'' on IRC).&lt;br /&gt;
&lt;br /&gt;
=== Subtitles ===&lt;br /&gt;
&lt;br /&gt;
'''Description:''' FFmpeg has been working on improving its subtitle support recently, notably by adding support for various text subtitle formats and various hardsubbing (burning the subtitles onto the video) facilities. While the theme may sound relatively simple compared to audio/video signal processing, the project carries a historical burden that is not easy to deal with, and introduces various issues very specific to subtitles' sparse form.&lt;br /&gt;
&lt;br /&gt;
'''Expected results:'''&lt;br /&gt;
* Add support for new subtitle formats. Example: a demuxer for .SUP files, just like VobSub but for Blu-ray.&lt;br /&gt;
* Improve text subtitle decoders. Typically, this can mean supporting advanced markup features in SAMI or WebVTT.&lt;br /&gt;
* Improve the API to facilitate proper integration of subtitles into libavfilter.&lt;br /&gt;
&lt;br /&gt;
'''Prerequisites:''' C coding skills, familiarity with git/source code control systems. Some background in fansubbing area (notably ASS experience) would be a bonus but is not strictly required.&lt;br /&gt;
&lt;br /&gt;
'''Qualification task:''' write one subtitle demuxer or muxer, and make some improvements to the different layers of the subtitle stack (to make sure the whole chain is understood).&lt;br /&gt;
&lt;br /&gt;
'''Mentor:''' Clément Bœsch (''ubitux'' on IRC).&lt;br /&gt;
&lt;br /&gt;
[[Category:FFmpeg]]&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=Small_FFmpeg_Tasks&amp;diff=14045</id>
		<title>Small FFmpeg Tasks</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=Small_FFmpeg_Tasks&amp;diff=14045"/>
		<updated>2012-05-08T18:39:56Z</updated>

		<summary type="html">&lt;p&gt;Reimar: /* Programming Tasks */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page contains ideas for small, relatively simple tasks for the [[FFmpeg]] project. People who might be interested in trying one of these tasks:&lt;br /&gt;
* Someone who wants to contribute to FFmpeg and needs to find a well-defined task to start with&lt;br /&gt;
* Someone who wishes to qualify for one of FFmpeg's coveted [[FFmpeg Summer Of Code|Summer of Code]] project slots&lt;br /&gt;
* An existing FFmpeg developer who has been away from the project for a while and needs a smaller task as motivation for re-learning the codebase&lt;br /&gt;
&lt;br /&gt;
For other tasks of varying difficulty, see the [[Interesting Patches]] page.&lt;br /&gt;
&lt;br /&gt;
'''If you would like to work on one of these tasks''', please take these steps:&lt;br /&gt;
* Subscribe to the [https://lists.ffmpeg.org/mailman/listinfo/ffmpeg-devel FFmpeg development mailing list] and indicate your interest&lt;br /&gt;
* Ask [[User:Multimedia Mike|Multimedia Mike]] for a Wiki account so you can claim your task on this Wiki&lt;br /&gt;
&lt;br /&gt;
'''If you would like to add to this list''', please be prepared to explain some useful details about the task. Excessively vague tasks with no supporting details will be ruthlessly deleted.&lt;br /&gt;
&lt;br /&gt;
== Programming Tasks ==&lt;br /&gt;
&lt;br /&gt;
=== Finish up a previous incomplete SoC project ===&lt;br /&gt;
&lt;br /&gt;
Several SoC projects from previous years have not yet made it into FFmpeg. Taking any of them and finishing them up to the point that they can be included should make for a good qualification task. Check out the [[FFmpeg Summer Of Code]] overview page and look for the unfinished projects, like the TS muxer.&lt;br /&gt;
&lt;br /&gt;
=== Add code to validate get_buffer usage of decoders ===&lt;br /&gt;
Change the default_get_buffer etc. functions to enforce the minimum guarantees the decoder requests.&lt;br /&gt;
E.g. if a decoder does not set FF_BUFFER_HINTS_READABLE, return a buffer without read permissions (using e.g. mprotect).&lt;br /&gt;
If the decoder does not use reget_buffer, always return a buffer initialized with random data.&lt;br /&gt;
If the decoder does not set FF_BUFFER_HINTS_PRESERVE, always destroy the buffer contents as soon as possible.&lt;br /&gt;
Make reget_buffer always fail if FF_BUFFER_HINTS_REUSABLE was not used.&lt;br /&gt;
More things along these lines could probably be done.&lt;br /&gt;
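A rough model of the random-data idea, outside libavcodec (poison_get_buffer is a hypothetical name, not FFmpeg API): hand out buffers pre-filled with random bytes unless the decoder asked for preserved contents, so any read-before-write shows up as noise.&lt;br /&gt;

```python
import os

def poison_get_buffer(size, preserve=False):
    # Sketch of a validating get_buffer: without the PRESERVE hint,
    # return a freshly poisoned buffer so stale-data assumptions break.
    if preserve:
        return bytearray(size)          # stable, zeroed contents
    return bytearray(os.urandom(size))  # poisoned on every call
```

The real task would enforce this (and read protection via mprotect) inside default_get_buffer itself.&lt;br /&gt;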
&lt;br /&gt;
=== Generic Colorspace system ===&lt;br /&gt;
This task involves adding support for more than 8 bits per component (for example, Y, U and V on 10 bits each)&lt;br /&gt;
and generic simple conversion to other colorspaces.&lt;br /&gt;
&lt;br /&gt;
''Does this have to do with revising FFmpeg's infrastructure? If so, then it doesn't feel like a qualification task. If it's something simpler, then the vague description does not convey that simplicity. Please expound.'' --[[User:Multimedia Mike|Multimedia Mike]] 12:56, 25 February 2008 (EST)&lt;br /&gt;
&lt;br /&gt;
''I don't think so, extending PixFmt to extended structure with finegrained description like depth, range values, colorspace, sample period, and write generic simple conversion from all formats to all others, like suggested by Michael on the mailing list. Conversion routine can be a good qualification task for video encoders/decoders. What do you think ?&lt;br /&gt;
--[[User:Bcoudurier|Baptiste Coudurier]] 00:30, 29 February 2008 (EST)&lt;br /&gt;
&lt;br /&gt;
''* Adding the [[YCoCg]] colorspace (with different sized planes) for RGB sourced pictures would be nice too. [[User:Elte|Elte]] 07:15, 16 March 2009 (EDT)&lt;br /&gt;
&lt;br /&gt;
=== Extend GIF Encoder and Decoder to support Animated GIFs ===&lt;br /&gt;
&lt;br /&gt;
=== Implement a Vivo demuxer ===&lt;br /&gt;
Implement a demuxer for the [[Vivo]] file format. The best reference for understanding the format would be MPlayer's [http://svn.mplayerhq.hu/mplayer/trunk/libmpdemux/demux_viv.c?view=markup existing .viv demuxer].&lt;br /&gt;
&lt;br /&gt;
This task corresponds to ticket 132: https://avcodec.org/trac/ffmpeg/ticket/132&lt;br /&gt;
&lt;br /&gt;
''I am ready to help out with understanding MPlayer's demuxer, esp. MPlayer API stuff if necessary.&lt;br /&gt;
--[[User:Reimar|Reimar]] 15:46, 1 March 2008 (EST)&lt;br /&gt;
&lt;br /&gt;
=== Port missing demuxers from MPlayer ===&lt;br /&gt;
MPlayer supports a few container formats in libmpdemux that are not yet present in libavformat. Porting them over and getting them relicensed as LGPL, or reimplementing them from scratch, should make for reasonable small tasks.&lt;br /&gt;
&lt;br /&gt;
# TiVo --&lt;br /&gt;
# VIVO -- ''Daniel Verkamp has a patch for this''&lt;br /&gt;
# SL support for MPEG-TS (anyone got samples?)&lt;br /&gt;
# MNG -- ''Paul B Mahol is working on this''&lt;br /&gt;
&lt;br /&gt;
=== Optimal Huffman tables for (M)JPEG ===&lt;br /&gt;
This task is outlined at http://guru.multimedia.cx/small-tasks-for-ffmpeg/ and is tracked in the issue tracker: http://roundup.libav.org/issue267&lt;br /&gt;
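The core of the task is computing optimal code lengths from symbol frequencies; a minimal sketch that ignores JPEG's extra constraints (maximum code length 16, no all-ones code), which a real patch must also handle:&lt;br /&gt;

```python
import heapq

def huffman_code_lengths(freqs):
    # Optimal Huffman code lengths for a {symbol: count} mapping.
    if len(freqs) == 1:          # a lone symbol still needs one bit
        return {next(iter(freqs)): 1}
    heap = [(count, [sym]) for sym, count in freqs.items()]
    heapq.heapify(heap)
    depth = dict.fromkeys(freqs, 0)
    while len(heap) > 1:
        c1, syms1 = heapq.heappop(heap)
        c2, syms2 = heapq.heappop(heap)
        for s in syms1 + syms2:  # every merge deepens the merged symbols
            depth[s] += 1
        heapq.heappush(heap, (c1 + c2, syms1 + syms2))
    return depth
```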
&lt;br /&gt;
=== M95 Playback System ===&lt;br /&gt;
This task is to implement a playback subsystem for [[M95]] files. This will entail writing a new file demuxer and video decoder (the audio is already uncompressed), both of which should be fairly easy by FFmpeg standards. [[M95|The M95 page]] contains the specs necessary to complete this task and points to downloadable samples.&lt;br /&gt;
&lt;br /&gt;
=== BRP Playback System ===&lt;br /&gt;
This task is to implement a playback subsystem for [[BRP]] files. This will entail writing a new file demuxer as well as a video decoder that can handle at least 2 variations of format data. Further, write an audio decoder for the custom DPCM format in the file. All of these tasks are considered fairly easy by FFmpeg standards. [[BRP|The BRP page]] contains the specs necessary to complete this task and points to downloadable samples for both known variations.&lt;br /&gt;
&lt;br /&gt;
=== 16-bit VQA Video Decoder ===&lt;br /&gt;
Westwood [[VQA]] files are already supported. However, there are three variations of its custom video codec, only the first two of which are supported. This task involves implementing support for the third variation. Visit the VQA samples repository: http://samples.multimedia.cx/game-formats/vqa/ -- The files in the directories Tiberian Sun VQAs/, bladerunner/, and dune2000/ use the 3rd variation of this codec. The [[VQA|VQA page]] should link to all the details you need to support this format.&lt;br /&gt;
&lt;br /&gt;
Discussion/patch:&lt;br /&gt;
http://thread.gmane.org/gmane.comp.video.ffmpeg.devel/89902/focus=90433&lt;br /&gt;
&lt;br /&gt;
=== HNM4 Playback System ===&lt;br /&gt;
This task is to implement a playback subsystem for the [[HNM4]] variant of the [[HNM]] format. This will entail writing a new file demuxer and video decoder, both of which are considered fairly easy by FFmpeg standards. [[HNM4|The HNM4 page]] contains the specs necessary to complete this task and links to downloadable samples.&lt;br /&gt;
&lt;br /&gt;
=== Apple RPZA encoder ===&lt;br /&gt;
A patch was once sent to the ffmpeg-devel mailing list to include an encoder for the [[Apple RPZA]] video codec. That code can be found on the &amp;quot;[[Interesting Patches]]&amp;quot; page. This qualification task involves applying that patch so that it can compile with current HEAD and then cleaning it up per the standards of the project. Engage the mailing list to learn more about what to do.&lt;br /&gt;
&lt;br /&gt;
=== QuickTime Edit List Support ===&lt;br /&gt;
Implement edit list support in the QuickTime demuxer (libavformat/mov.c). This involves parsing the 'elst' atom in a QuickTime file. For a demonstration of how this is a problem, download the file menace00.mov from http://samples.mplayerhq.hu/mov/editlist/ and play it with ffplay or transcode it with ffmpeg. Notice that the audio and video are ever so slightly out of sync. Proper edit list support will solve that. Other samples in that directory also presumably exhibit edit list-related bugs. The [http://xine.cvs.sourceforge.net/xine/xine-lib/src/demuxers/demux_qt.c?view=markup Xine demuxer] has support for this; it might be useful for hints.&lt;br /&gt;
&lt;br /&gt;
(a patch was submitted to ffmpeg-devel around 14 March 2009)&lt;br /&gt;
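For orientation, the version-0 'elst' payload is a version/flags word, an entry count, and then 12-byte entries of segment duration, media time and a 16.16 fixed-point rate; a parsing sketch:&lt;br /&gt;

```python
import struct

def parse_elst(payload):
    # Parse the body of a version-0 'elst' atom (after size and type):
    # version/flags (4 bytes), entry count (4), then 12-byte entries.
    version_flags, entry_count = struct.unpack(">II", payload[:8])
    entries = []
    offset = 8
    for _ in range(entry_count):
        duration, media_time, rate_fixed = struct.unpack(
            ">IiI", payload[offset:offset + 12])
        entries.append((duration, media_time, rate_fixed / 65536.0))
        offset += 12
    return entries
```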
&lt;br /&gt;
=== Add wma fixed point decoder back into libavcodec ===&lt;br /&gt;
http://svn.rockbox.org/viewvc.cgi/trunk/apps/codecs/libwma/&lt;br /&gt;
Rockbox's fixed-point WMA decoder was adapted from the decoder in libavcodec.&lt;br /&gt;
&lt;br /&gt;
=== VC1 timestamps in m2ts ===&lt;br /&gt;
&lt;br /&gt;
Codec copy of VC1 from m2ts currently doesn't work. Either extend the VC1 parser to output/fix timestamps, or fix the timestamps from m2ts demuxing.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== flip flag for upside-down codecs ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;about the flip, a patch that decodes images fliped when&lt;br /&gt;
codec_tag == ff_get_fourcc(&amp;quot;GEOX&amp;quot;) is welcome.&lt;br /&gt;
its a metter of 2lines manipulating data/linesize of imgages after&lt;br /&gt;
get_buffer() or something similar&lt;br /&gt;
[...]&lt;br /&gt;
-- &lt;br /&gt;
Michael     GnuPG fingerprint: 9FF2128B147EF6730BADF133611EC787040B0FAB&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
more info:&lt;br /&gt;
http://roundup.ffmpeg.org/roundup/ffmpeg/issue741&lt;br /&gt;
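The trick Michael describes can be modelled in a few lines: point the data pointer at the last row and negate the stride. An illustrative sketch (not libavcodec code):&lt;br /&gt;

```python
def flipped_rows(buf, width, height, linesize):
    # Mimic the pointer trick: "data" points at the last row and the
    # stride (linesize) is negated, so iteration walks bottom-up.
    start = (height - 1) * linesize
    stride = -linesize
    for row in range(height):
        off = start + row * stride
        yield buf[off:off + width]
```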
&lt;br /&gt;
=== lavf-based concatenation tool ===&lt;br /&gt;
&lt;br /&gt;
It would be nice to have a libavformat-based tool that would extract frames from multiple files (possibly in different containers as well) and put them into a single one.&lt;br /&gt;
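The heart of such a tool is shifting each input's timestamps by the accumulated duration of the previous inputs; a sketch that ignores timebase conversion and codec parameter checks:&lt;br /&gt;

```python
def concat_streams(files):
    # files: list of (duration, packets), packet = (pts, data).
    # Shift each file's pts by the total duration seen so far.
    out = []
    offset = 0
    for duration, packets in files:
        for pts, data in packets:
            out.append((pts + offset, data))
        offset += duration
    return out
```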
&lt;br /&gt;
=== vcr1 encoder ===&lt;br /&gt;
According to this: http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2009-February/063555.html the vcr1 encoder is disabled, and won't compile if enabled. Michael would prefer to keep it around, and have someone grow it into a full encoder.&lt;br /&gt;
&lt;br /&gt;
=== implement some colorspace fourcc/codecs ===&lt;br /&gt;
some colorspace formats were uploaded to http://samples.mplayerhq.hu/V-codecs/&lt;br /&gt;
including:&lt;br /&gt;
 CYUV.AVI is 8 Bit Interleaved 4:2:2&lt;br /&gt;
 a12v.avi is 4:2:2:4 10 Bit Interleaved&lt;br /&gt;
 auv2.avi is 4:2:2:4 8 Bit Interleaved&lt;br /&gt;
 and V-codecs/yuv8/MAILTEST.AVI .&lt;br /&gt;
&lt;br /&gt;
it might decode with current pixfmts; a sample commit is 9853bbb21a19d540850de60d3e9cf7c6ef9da7dc&lt;br /&gt;
&lt;br /&gt;
a sample commit for adding new input formats to swscale is 4884b9e50d416f84e64bfaf546a03e490cb83a2f&lt;br /&gt;
 hunks 3 and 5 are not needed; they are optional special converters&lt;br /&gt;
 the change to isSupportedOut() is also not needed&lt;br /&gt;
 the above will add a new input format&lt;br /&gt;
&lt;br /&gt;
another example for adding an input format is a43fb6b37efa5b01f2c9bdc414570691229bcfab&lt;br /&gt;
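As a model of what a packed 4:2:2 decoder has to do, here is a sketch that splits a U-Y-V-Y byte order into planes (the actual byte order of these samples is an assumption to verify against the files):&lt;br /&gt;

```python
def uyvy_to_planar(buf):
    # Split packed 4:2:2 bytes (assumed U Y V Y order) into planes.
    y, u, v = bytearray(), bytearray(), bytearray()
    for i in range(0, len(buf), 4):
        u.append(buf[i])
        y.append(buf[i + 1])
        v.append(buf[i + 2])
        y.append(buf[i + 3])
    return bytes(y), bytes(u), bytes(v)
```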
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Implement Phantom Cine demuxer and Bayer format support for swscale ===&lt;br /&gt;
The format is described here:&lt;br /&gt;
http://wiki.multimedia.cx/index.php?title=Phantom_Cine&lt;br /&gt;
It will need support for Bayer -&amp;gt; RGB conversion in swscale to make the demuxer useful though.&lt;br /&gt;
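A crude model of the Bayer conversion, assuming an RGGB pattern and using nearest-neighbour sampling (real swscale support should interpolate neighbouring sites):&lt;br /&gt;

```python
def debayer_nearest(raw, width, height):
    # One RGB pixel per 2x2 RGGB cell: take R, the first G, and B.
    rgb = []
    for yy in range(0, height, 2):
        for xx in range(0, width, 2):
            r = raw[yy * width + xx]
            g = raw[yy * width + xx + 1]
            b = raw[(yy + 1) * width + xx + 1]
            rgb.append((r, g, b))
    return rgb
```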
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== support for [[YCoCg]]/RGB colorspace in FFV1 ===&lt;br /&gt;
Add support for [[YCoCg]] and [[RGB]] encoded sources for the [[FFV1]] codec&lt;br /&gt;
&lt;br /&gt;
This would add a free lossless intra-frame RGB codec for all supported platforms (most importantly OS X and Windows), which is often asked for in video editing forums (e.g. slashcam.de).&lt;br /&gt;
&lt;br /&gt;
=== [[IFF#ANIM|IFF ANIM]] decoder ===&lt;br /&gt;
Modify libavformat/iff.c to handle this chunk and write a decoder for the format. The wiki page at [[IFF#ANIM|IFF ANIM]] has links to more information and source code. Samples can be found at http://www-user.tu-chemnitz.de/~womar/projects/iffanim/iffanim_samplepack.zip .&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== port missing decoders/demuxers from other open source projects. ===&lt;br /&gt;
&lt;br /&gt;
http://www.mega-nerd.com/libsndfile/#Features&lt;br /&gt;
 Paris Audio File PAF&lt;br /&gt;
 IRCAM SF&lt;br /&gt;
 GNU Octave 2.0 MAT4&lt;br /&gt;
 GNU Octave 2.1 MAT5&lt;br /&gt;
 Portable Voice Format PVFSound&lt;br /&gt;
 Designer II SD2&lt;br /&gt;
samples are here: http://www.mega-nerd.com/tmp/SoundFileCollection-20050711-0902.tgz&lt;br /&gt;
&lt;br /&gt;
http://www.hawksoft.com/hawkvoice/&lt;br /&gt;
 HVDI_VOICE_DATA- packet&lt;br /&gt;
 [[GSM]]&lt;br /&gt;
 LPC&lt;br /&gt;
 CELP&lt;br /&gt;
 LPC10&lt;br /&gt;
&lt;br /&gt;
http://sourceforge.net/projects/vgmstream&lt;br /&gt;
 150+ formats: http://vgmstream.svn.sourceforge.net/viewvc/vgmstream/readme.txt&lt;br /&gt;
&lt;br /&gt;
http://www.imagemagick.org&lt;br /&gt;
http://www.graphicsmagick.org/formats.html&lt;br /&gt;
 many image formats not supported yet.&lt;br /&gt;
&lt;br /&gt;
http://gpac.sourceforge.net/&lt;br /&gt;
 [[MPEG-4 BIFS]]&lt;br /&gt;
 3GPP DIMS&lt;br /&gt;
 [[LASeR]]&lt;br /&gt;
 SAF&lt;br /&gt;
 SVG&lt;br /&gt;
 [[Synchronized Multimedia Integration Language|SMIL]]&lt;br /&gt;
 VRML&lt;br /&gt;
 X3D&lt;br /&gt;
 XMT&lt;br /&gt;
&lt;br /&gt;
http://adplug.sourceforge.net/&lt;br /&gt;
http://adplug.sourceforge.net/library/&lt;br /&gt;
 many OPL2/OPL3 audio formats not supported yet.&lt;br /&gt;
&lt;br /&gt;
http://mikmod.raphnet.net/&lt;br /&gt;
http://mikmod.raphnet.net/#features&lt;br /&gt;
 many music pattern formats not supported yet.&lt;br /&gt;
&lt;br /&gt;
http://www.fly.net/~ant/libs/audio.html#Game_Music_Emu&lt;br /&gt;
 AY&lt;br /&gt;
 GBS&lt;br /&gt;
 GYM&lt;br /&gt;
 HES&lt;br /&gt;
 KSS&lt;br /&gt;
 NSF, NSFE&lt;br /&gt;
 SAP&lt;br /&gt;
 [[SNES-SPC700 Sound Format]]&lt;br /&gt;
 VGM, VGZ&lt;br /&gt;
&lt;br /&gt;
=== libswscale PAL8 output ===&lt;br /&gt;
&lt;br /&gt;
See the thread: &amp;quot;[RFC] libswscale palette output implementation&amp;quot;:&lt;br /&gt;
http://thread.gmane.org/gmane.comp.video.ffmpeg.devel/101397&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== vloopback output support ===&lt;br /&gt;
&lt;br /&gt;
vloopback is a Linux kernel device that allows creating a virtual video device that&lt;br /&gt;
programs can write to, and that can be accessed as a normal video device:&lt;br /&gt;
http://www.lavrsen.dk/twiki/bin/view/Motion/VideoFourLinuxLoopbackDevice&lt;br /&gt;
&lt;br /&gt;
This would allow writing the ffmpeg output to a vloopback device so it can be displayed by a&lt;br /&gt;
program reading from that device (e.g. skype, a voip client, etc.).&lt;br /&gt;
&lt;br /&gt;
An example of a program which uses vloopback:&lt;br /&gt;
http://www.ws4gl.org/&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Port video filters from MPlayer/VLC/Mjpegtools/Effectv/etc etc to libavfilter ===&lt;br /&gt;
&lt;br /&gt;
There are plenty of programs providing their own filters; many of them may be easily ported to the&lt;br /&gt;
superior ;-) framework of libavfilter. It may also be possible to create wrappers around other libraries&lt;br /&gt;
(e.g. opencv, libgimp, libshowphoto, libaa).&lt;br /&gt;
&lt;br /&gt;
=== rar/zip/gz/bz2 etc demuxer ===&lt;br /&gt;
There are still compressed files out there (zipped raw WAV, multi-RAR'ed videos, etc.). Create a decompression demuxer for them.&lt;br /&gt;
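For the zip case, the stdlib shows how little transparent unpacking takes; a demuxer would do the equivalent and then probe the unpacked stream:&lt;br /&gt;

```python
import io
import zipfile

def first_member(path_or_file):
    # Open the first file inside a zip archive as a seekable stream,
    # the kind of transparent unpacking a decompression demuxer needs.
    with zipfile.ZipFile(path_or_file) as zf:
        name = zf.namelist()[0]
        return io.BytesIO(zf.read(name))
```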
&lt;br /&gt;
=== Less common AAC decoder features ===&lt;br /&gt;
&lt;br /&gt;
Add support to the AAC decoder for object type ER AAC LC or AAC LC 960.&lt;br /&gt;
&lt;br /&gt;
=== arithmetic decoding (and encoding) for mjpeg ===&lt;br /&gt;
The following marker codes are not supported by our MJPEG decoder:&lt;br /&gt;
DAC, SOF9, SOF10, SOF11, SOF13, SOF14 and SOF15.&lt;br /&gt;
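For reference, these are the arithmetic-coding marker bytes from ITU-T T.81; a small scanner sketch for spotting them in a stream:&lt;br /&gt;

```python
ARITH_MARKERS = {0xC9: "SOF9", 0xCA: "SOF10", 0xCB: "SOF11", 0xCC: "DAC",
                 0xCD: "SOF13", 0xCE: "SOF14", 0xCF: "SOF15"}

def arithmetic_markers(data):
    # Scan a JPEG byte stream for 0xFF-prefixed arithmetic markers.
    found = []
    for i in range(len(data) - 1):
        if data[i] == 0xFF and data[i + 1] in ARITH_MARKERS:
            found.append(ARITH_MARKERS[data[i + 1]])
    return found
```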
&lt;br /&gt;
=== [[Enhanced Variable Rate Codec]] decoder ===&lt;br /&gt;
samples and specs available.&lt;br /&gt;
&lt;br /&gt;
=== adobe http f4f segmented fragmentation dynamic streaming format ===&lt;br /&gt;
Sample streams can be found on http://www.fox.com . Command line instructions for creating such files: http://help.adobe.com/en_US/HTTPStreaming/1.0/Using/WS9463dbe8dbe45c4c-c126f3b1260533756d-7ffc.html . The spec is available under Adobe NDA; it is not to be confused with the freely available F4V specification. Open-source PHP code to convert f4f to flv: https://github.com/svnpenn/dotfiles/blob/master/etc/AdobeHDS.php&lt;br /&gt;
&lt;br /&gt;
=== get 3IV1 decoder working and benchmark ===&lt;br /&gt;
We have a decoder for 3IV1. It is currently #if 0'd out in the mpeg4 decoder. Your task is to test whether it still builds and works, and fix it so that it does not slow down the mpeg4 decoder if enabled. The end goal is to enable it by default.&lt;br /&gt;
&lt;br /&gt;
== Reverse Engineering Tasks ==&lt;br /&gt;
&lt;br /&gt;
=== Demuxer for csf format and video codec ===&lt;br /&gt;
This is partially analyzed in http://ffmpeg.org/trac/ffmpeg/ticket/1060&lt;br /&gt;
&lt;br /&gt;
=== analyze realplayer's ivr format and create a demuxer for it ===&lt;br /&gt;
samples on [[IVR]] page.&lt;br /&gt;
&lt;br /&gt;
=== pick a random binary codec from mplayer ===&lt;br /&gt;
[[MPlayer]] has over 100 binary codecs which have no open-source decoder. Pick one, find a sample, and try to reverse engineer it. Note that some work has been done on some codecs, and it's a good idea to ask on the mailing list before starting.&lt;br /&gt;
&lt;br /&gt;
=== emblaze demuxer/decoder from java code ===&lt;br /&gt;
samples and java decoder: http://samples.mplayerhq.hu/internets/emblaze/&lt;br /&gt;
&lt;br /&gt;
== Non-Programming Tasks ==&lt;br /&gt;
&lt;br /&gt;
=== Check Linux distributions for patches to ffmpeg ===&lt;br /&gt;
&lt;br /&gt;
Check various distros like Fedora, Ubuntu, Debian, Mint, Arch, Suse, etc. for patches to ffmpeg. Write down the location of the patches so it can be checked on an annual basis. If patches are found, report them to the ffmpeg-devel mailing list or the bug trac.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== improve layout and accessibility of ffmpeg website ===&lt;br /&gt;
&lt;br /&gt;
Test ffmpeg.org with various browsers, including screen readers, and get it optimized and accessible to people with poor vision. Check wording and general ease of use, for example adding large download links for users like VLC and Firefox have.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== restore and update compatibility page on ffmpeg website ===&lt;br /&gt;
&lt;br /&gt;
We used to have a page that detailed how to create files for other software players and operating systems. Restore this page from git history and update it for new devices and standardized codecs (h264 is the preferred codec now).&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== contact large ffmpeg users for broken / unplayable samples ===&lt;br /&gt;
Contact the largest users of ffmpeg, like YouTube, Facebook, archive.org, blip.tv and others, and ask them for access to samples that do not decode correctly.&lt;br /&gt;
&lt;br /&gt;
=== review sample request error messages ===&lt;br /&gt;
FFmpeg has a generic av_log_ask_for_sample log message to ask the user for a sample when there is a problem. Your task is to review ffmpeg decoders and demuxers (and possibly other inputs) and replace regular av_log messages requesting samples with it. An example commit is here: http://ffmpeg.org/pipermail/ffmpeg-cvslog/2011-April/036509.html&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== write bluray and 3d howtos ===&lt;br /&gt;
Write a document or wiki article, or just link to some info, on how to play, encode, and rip Blu-ray using ffmpeg/mplayer and the various Blu-ray libs required. Also write a guide on how to use ffmpeg/mplayer/vlc to encode and play various 3D formats, maybe including some supported hardware screens/video cards with examples. This includes updating the wiki pages [[Blu Ray and HD-DVD Playback Status]]&lt;br /&gt;
[[How to make a 3d movie with ffmpeg]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== write fate breakage notification bot ===&lt;br /&gt;
Script up a bot that scans http://fate.ffmpeg.org and tells #ffmpeg-devel when something breaks compilation in FATE.&lt;br /&gt;
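The diffing half of such a bot is trivial; a sketch that flags configurations which flipped from passing to failing between two polls (scraping and IRC delivery left out):&lt;br /&gt;

```python
def new_breakages(prev, curr):
    # Given {config: passed} snapshots of FATE results, return the
    # configurations that flipped from passing to failing.
    return sorted(c for c, ok in curr.items()
                  if not ok and prev.get(c, True))
```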
&lt;br /&gt;
&lt;br /&gt;
=== write ipad/iphone/ios howto ===&lt;br /&gt;
Write up some documentation on how to compile ffmpeg/ffplay/ffserver for iOS: exact tool versions, command lines, library requirements, both native compilation on the device and cross-compiling from OS X. An Android howto for various devices would be useful too.&lt;br /&gt;
&lt;br /&gt;
[[Category:FFmpeg]]&lt;br /&gt;
[[Category:Libav]]&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=Small_FFmpeg_Tasks&amp;diff=14043</id>
		<title>Small FFmpeg Tasks</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=Small_FFmpeg_Tasks&amp;diff=14043"/>
		<updated>2012-04-29T12:36:43Z</updated>

		<summary type="html">&lt;p&gt;Reimar: /* Reverse Engineering Tasks */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page contains ideas for small, relatively simple tasks for the [[FFmpeg]] project. People who might be interested in trying one of these tasks:&lt;br /&gt;
* Someone who wants to contribute to FFmpeg and needs to find a well-defined task to start with&lt;br /&gt;
* Someone who wishes to qualify for one of FFmpeg's coveted [[FFmpeg Summer Of Code|Summer of Code]] project slots&lt;br /&gt;
* An existing FFmpeg developer who has been away from the project for a while and needs a smaller task as motivation for re-learning the codebase&lt;br /&gt;
&lt;br /&gt;
For other tasks of varying difficulty, see the [[Interesting Patches]] page.&lt;br /&gt;
&lt;br /&gt;
'''If you would like to work on one of these tasks''', please take these steps:&lt;br /&gt;
* Subscribe to the [https://lists.ffmpeg.org/mailman/listinfo/ffmpeg-devel FFmpeg development mailing list] and indicate your interest&lt;br /&gt;
* Ask [[User:Multimedia Mike|Multimedia Mike]] for a Wiki account so you can claim your task on this Wiki&lt;br /&gt;
&lt;br /&gt;
'''If you would like to add to this list''', please be prepared to explain some useful details about the task. Excessively vague tasks with no supporting details will be ruthlessly deleted.&lt;br /&gt;
&lt;br /&gt;
== Programming Tasks ==&lt;br /&gt;
&lt;br /&gt;
=== Finish up a previous incomplete SoC project ===&lt;br /&gt;
&lt;br /&gt;
Several SoC projects from previous years have not yet made it into FFmpeg. Taking any of them and finishing them up to the point that they can be included should make for a good qualification task. Check out the [[FFmpeg Summer Of Code]] overview page and look for the unfinished projects, like the TS muxer.&lt;br /&gt;
&lt;br /&gt;
=== Generic Colorspace system ===&lt;br /&gt;
This task involves adding support for more than 8 bits per component (for example, Y, U and V on 10 bits each)&lt;br /&gt;
and generic simple conversion to other colorspaces.&lt;br /&gt;
&lt;br /&gt;
''Does this have to do with revising FFmpeg's infrastructure? If so, then it doesn't feel like a qualification task. If it's something simpler, then the vague description does not convey that simplicity. Please expound.'' --[[User:Multimedia Mike|Multimedia Mike]] 12:56, 25 February 2008 (EST)&lt;br /&gt;
&lt;br /&gt;
''I don't think so, extending PixFmt to extended structure with finegrained description like depth, range values, colorspace, sample period, and write generic simple conversion from all formats to all others, like suggested by Michael on the mailing list. Conversion routine can be a good qualification task for video encoders/decoders. What do you think ?&lt;br /&gt;
--[[User:Bcoudurier|Baptiste Coudurier]] 00:30, 29 February 2008 (EST)&lt;br /&gt;
&lt;br /&gt;
''* Adding the [[YCoCg]] colorspace (with different sized planes) for RGB sourced pictures would be nice too. [[User:Elte|Elte]] 07:15, 16 March 2009 (EDT)&lt;br /&gt;
&lt;br /&gt;
=== Extend GIF Encoder and Decoder to support Animated GIFs ===&lt;br /&gt;
&lt;br /&gt;
=== Implement a Vivo demuxer ===&lt;br /&gt;
Implement a demuxer for the [[Vivo]] file format. The best reference for understanding the format would be MPlayer's [http://svn.mplayerhq.hu/mplayer/trunk/libmpdemux/demux_viv.c?view=markup existing .viv demuxer].&lt;br /&gt;
&lt;br /&gt;
This task corresponds to ticket 132: https://avcodec.org/trac/ffmpeg/ticket/132&lt;br /&gt;
&lt;br /&gt;
''I am ready to help out with understanding MPlayer's demuxer, esp. MPlayer API stuff if necessary.&lt;br /&gt;
--[[User:Reimar|Reimar]] 15:46, 1 March 2008 (EST)&lt;br /&gt;
&lt;br /&gt;
=== Port missing demuxers from MPlayer ===&lt;br /&gt;
MPlayer supports a few container formats in libmpdemux that are not yet present in libavformat. Porting them over and getting them relicensed as LGPL, or reimplementing them from scratch, should make for reasonable small tasks.&lt;br /&gt;
&lt;br /&gt;
# TiVo --&lt;br /&gt;
# VIVO -- ''Daniel Verkamp has a patch for this''&lt;br /&gt;
# SL support for MPEG-TS (anyone got samples?)&lt;br /&gt;
# MNG -- ''Paul B Mahol is working on this''&lt;br /&gt;
&lt;br /&gt;
=== Optimal Huffman tables for (M)JPEG ===&lt;br /&gt;
This task is outlined at http://guru.multimedia.cx/small-tasks-for-ffmpeg/ and is tracked in the issue tracker: http://roundup.libav.org/issue267&lt;br /&gt;
&lt;br /&gt;
=== M95 Playback System ===&lt;br /&gt;
This task is to implement a playback subsystem for [[M95]] files. This will entail writing a new file demuxer and video decoder (the audio is already uncompressed), both of which should be fairly easy by FFmpeg standards. [[M95|The M95 page]] contains the specs necessary to complete this task and points to downloadable samples.&lt;br /&gt;
&lt;br /&gt;
=== BRP Playback System ===&lt;br /&gt;
This task is to implement a playback subsystem for [[BRP]] files. This will entail writing a new file demuxer as well as a video decoder that can handle at least 2 variations of format data. Further, write an audio decoder for the custom DPCM format in the file. All of these tasks are considered fairly easy by FFmpeg standards. [[BRP|The BRP page]] contains the specs necessary to complete this task and points to downloadable samples for both known variations.&lt;br /&gt;
&lt;br /&gt;
=== 16-bit VQA Video Decoder ===&lt;br /&gt;
Westwood [[VQA]] files are already supported. However, there are three variations of its custom video codec, only the first two of which are supported. This task involves implementing support for the third variation. Visit the VQA samples repository: http://samples.multimedia.cx/game-formats/vqa/ -- The files in the directories Tiberian Sun VQAs/, bladerunner/, and dune2000/ use the 3rd variation of this codec. The [[VQA|VQA page]] should link to all the details you need to support this format.&lt;br /&gt;
&lt;br /&gt;
Discussion/patch:&lt;br /&gt;
http://thread.gmane.org/gmane.comp.video.ffmpeg.devel/89902/focus=90433&lt;br /&gt;
&lt;br /&gt;
=== HNM4 Playback System ===&lt;br /&gt;
This task is to implement a playback subsystem for the [[HNM4]] variant of the [[HNM]] format. This will entail writing a new file demuxer and video decoder, both of which are considered fairly easy by FFmpeg standards. [[HNM4|The HNM4 page]] contains the specs necessary to complete this task and links to downloadable samples.&lt;br /&gt;
&lt;br /&gt;
=== Apple RPZA encoder ===&lt;br /&gt;
A patch was once sent to the ffmpeg-devel mailing list to include an encoder for the [[Apple RPZA]] video codec. That code can be found on the &amp;quot;[[Interesting Patches]]&amp;quot; page. This qualification task involves applying that patch so that it can compile with current HEAD and then cleaning it up per the standards of the project. Engage the mailing list to learn more about what to do.&lt;br /&gt;
&lt;br /&gt;
=== QuickTime Edit List Support ===&lt;br /&gt;
Implement edit list support in the QuickTime demuxer (libavformat/mov.c). This involves parsing the 'elst' atom in a QuickTime file. For a demonstration of how this is a problem, download the file menace00.mov from http://samples.mplayerhq.hu/mov/editlist/ and play it with ffplay or transcode it with ffmpeg. Notice that the audio and video are ever so slightly out of sync. Proper edit list support will solve that. Other samples in that directory also presumably exhibit edit list-related bugs. The [http://xine.cvs.sourceforge.net/xine/xine-lib/src/demuxers/demux_qt.c?view=markup Xine demuxer] has support for this; it might be useful for hints.&lt;br /&gt;
&lt;br /&gt;
(a patch was submitted to ffmpeg-devel around 14 March 2009)&lt;br /&gt;
&lt;br /&gt;
=== Add wma fixed point decoder back into libavcodec ===&lt;br /&gt;
http://svn.rockbox.org/viewvc.cgi/trunk/apps/codecs/libwma/&lt;br /&gt;
Rockbox's fixed-point WMA decoder was adapted from the decoder in libavcodec.&lt;br /&gt;
&lt;br /&gt;
=== VC1 timestamps in m2ts ===&lt;br /&gt;
&lt;br /&gt;
Codec copy of VC1 from m2ts currently doesn't work. Either extend the VC1 parser to output/fix timestamps, or fix the timestamps from m2ts demuxing.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== flip flag for upside-down codecs ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;about the flip, a patch that decodes images fliped when&lt;br /&gt;
codec_tag == ff_get_fourcc(&amp;quot;GEOX&amp;quot;) is welcome.&lt;br /&gt;
its a metter of 2lines manipulating data/linesize of imgages after&lt;br /&gt;
get_buffer() or something similar&lt;br /&gt;
[...]&lt;br /&gt;
-- &lt;br /&gt;
Michael     GnuPG fingerprint: 9FF2128B147EF6730BADF133611EC787040B0FAB&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
more info:&lt;br /&gt;
http://roundup.ffmpeg.org/roundup/ffmpeg/issue741&lt;br /&gt;
&lt;br /&gt;
=== lavf-based concatenation tool ===&lt;br /&gt;
&lt;br /&gt;
It would be nice to have a libavformat-based tool that would extract frames from multiple files (possibly in different containers as well) and put them into a single one.&lt;br /&gt;
&lt;br /&gt;
=== vcr1 encoder ===&lt;br /&gt;
According to this: http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2009-February/063555.html the vcr1 encoder is disabled, and won't compile if enabled. Michael would prefer to keep it around, and have someone grow it into a full encoder.&lt;br /&gt;
&lt;br /&gt;
=== implement some colorspace fourcc/codecs ===&lt;br /&gt;
some colorspace formats were uploaded to http://samples.mplayerhq.hu/V-codecs/&lt;br /&gt;
including:&lt;br /&gt;
 CYUV.AVI is 8 Bit Interleaved 4:2:2&lt;br /&gt;
 a12v.avi is 4:2:2:4 10 Bit Interleaved&lt;br /&gt;
 auv2.avi is 4:2:2:4 8 Bit Interleaved&lt;br /&gt;
 and V-codecs/yuv8/MAILTEST.AVI .&lt;br /&gt;
&lt;br /&gt;
it might decode with current pixfmts; a sample commit is 9853bbb21a19d540850de60d3e9cf7c6ef9da7dc&lt;br /&gt;
&lt;br /&gt;
a sample commit for adding new input formats to swscale is 4884b9e50d416f84e64bfaf546a03e490cb83a2f&lt;br /&gt;
 hunks 3 and 5 are not needed; they are optional special converters&lt;br /&gt;
 the change to isSupportedOut() is also not needed&lt;br /&gt;
 the above will add a new input format&lt;br /&gt;
&lt;br /&gt;
another example for adding an input format is a43fb6b37efa5b01f2c9bdc414570691229bcfab&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Implement Phantom Cine demuxer and Bayer format support for swscale ===&lt;br /&gt;
The format is described here:&lt;br /&gt;
http://wiki.multimedia.cx/index.php?title=Phantom_Cine&lt;br /&gt;
It will need support for Bayer -&amp;gt; RGB conversion in swscale to make the demuxer useful though.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== support for [[YCoCg]]/RGB colorspace in FFV1 ===&lt;br /&gt;
Add support for [[YCoCg]] and [[RGB]] encoded sources for the [[FFV1]] codec&lt;br /&gt;
&lt;br /&gt;
This would add a free lossless intra-frame RGB codec for all supported platforms (most importantly OS X and Windows), which is often asked for in video editing forums (e.g. slashcam.de).&lt;br /&gt;
&lt;br /&gt;
=== [[IFF#ANIM|IFF ANIM]] decoder ===&lt;br /&gt;
Modify libavformat/iff.c to handle this chunk and write a decoder for the format. The wiki page at [[IFF#ANIM|IFF ANIM]] has links to more information and source code. Samples can be found at http://www-user.tu-chemnitz.de/~womar/projects/iffanim/iffanim_samplepack.zip .&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== port missing decoders/demuxers from other open source projects. ===&lt;br /&gt;
&lt;br /&gt;
http://www.mega-nerd.com/libsndfile/#Features&lt;br /&gt;
 Paris Audio File PAF&lt;br /&gt;
 IRCAM SF&lt;br /&gt;
 GNU Octave 2.0 MAT4&lt;br /&gt;
 GNU Octave 2.1 MAT5&lt;br /&gt;
 Portable Voice Format PVFSound&lt;br /&gt;
 Designer II SD2&lt;br /&gt;
samples are here: http://www.mega-nerd.com/tmp/SoundFileCollection-20050711-0902.tgz&lt;br /&gt;
&lt;br /&gt;
http://www.hawksoft.com/hawkvoice/&lt;br /&gt;
 HVDI_VOICE_DATA- packet&lt;br /&gt;
 [[GSM]]&lt;br /&gt;
 LPC&lt;br /&gt;
 CELP&lt;br /&gt;
 LPC10&lt;br /&gt;
&lt;br /&gt;
http://sourceforge.net/projects/vgmstream&lt;br /&gt;
 150+ formats: http://vgmstream.svn.sourceforge.net/viewvc/vgmstream/readme.txt&lt;br /&gt;
&lt;br /&gt;
http://www.imagemagick.org&lt;br /&gt;
http://www.graphicsmagick.org/formats.html&lt;br /&gt;
 many image formats not supported yet.&lt;br /&gt;
&lt;br /&gt;
http://gpac.sourceforge.net/&lt;br /&gt;
 [[MPEG-4 BIFS]]&lt;br /&gt;
 3GPP DIMS&lt;br /&gt;
 [[LASeR]]&lt;br /&gt;
 SAF&lt;br /&gt;
 SVG&lt;br /&gt;
 [[Synchronized Multimedia Integration Language|SMIL]]&lt;br /&gt;
 VRML&lt;br /&gt;
 X3D&lt;br /&gt;
 XMT&lt;br /&gt;
&lt;br /&gt;
http://adplug.sourceforge.net/&lt;br /&gt;
http://adplug.sourceforge.net/library/&lt;br /&gt;
 many OPL2/OPL3 audio formats not supported yet.&lt;br /&gt;
&lt;br /&gt;
http://mikmod.raphnet.net/&lt;br /&gt;
http://mikmod.raphnet.net/#features&lt;br /&gt;
 many music pattern formats not supported yet.&lt;br /&gt;
&lt;br /&gt;
http://www.fly.net/~ant/libs/audio.html#Game_Music_Emu&lt;br /&gt;
 AY&lt;br /&gt;
 GBS&lt;br /&gt;
 GYM&lt;br /&gt;
 HES&lt;br /&gt;
 KSS&lt;br /&gt;
 NSF, NSFE&lt;br /&gt;
 SAP&lt;br /&gt;
 [[SNES-SPC700 Sound Format]]&lt;br /&gt;
 VGM, VGZ&lt;br /&gt;
&lt;br /&gt;
=== libswscale PAL8 output ===&lt;br /&gt;
&lt;br /&gt;
See the thread: &amp;quot;[RFC] libswscale palette output implementation&amp;quot;:&lt;br /&gt;
http://thread.gmane.org/gmane.comp.video.ffmpeg.devel/101397&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== vloopback output support ===&lt;br /&gt;
&lt;br /&gt;
vloopback is a Linux kernel device that allows creating a virtual video device to which&lt;br /&gt;
programs can write, and which can be accessed like a normal video device:&lt;br /&gt;
http://www.lavrsen.dk/twiki/bin/view/Motion/VideoFourLinuxLoopbackDevice&lt;br /&gt;
&lt;br /&gt;
This would make it possible to write the ffmpeg output to a vloopback device and have it&lt;br /&gt;
displayed by a program reading from that device (e.g. Skype or another VoIP client).&lt;br /&gt;
&lt;br /&gt;
An example of a program which uses vloopback:&lt;br /&gt;
http://www.ws4gl.org/&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Port video filters from MPlayer/VLC/Mjpegtools/Effectv/etc etc to libavfilter ===&lt;br /&gt;
&lt;br /&gt;
There are plenty of programs providing their own filters; many of them could easily be ported to the&lt;br /&gt;
superior ;-) framework of libavfilter. It may also be possible to create wrappers around other libraries&lt;br /&gt;
(e.g. opencv, libgimp, libshowphoto, libaa).&lt;br /&gt;
&lt;br /&gt;
=== rar/zip/gz/bz2 etc demuxer ===&lt;br /&gt;
There are still compressed files out there (zipped raw WAV, multi-RAR'ed videos etc.). Create a decompression demuxer for them.&lt;br /&gt;
&lt;br /&gt;
=== Less common AAC decoder features ===&lt;br /&gt;
&lt;br /&gt;
Add support to the AAC decoder for the ER AAC LC object type or for AAC LC with 960-sample frames.&lt;br /&gt;
&lt;br /&gt;
=== arithmetic decoding (and encoding) for mjpeg ===&lt;br /&gt;
The following marker codes are not supported by our MJPEG decoder:&lt;br /&gt;
DAC, SOF9, SOF10, SOF11, SOF13, SOF14 and SOF15.&lt;br /&gt;
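As a starting point, a hypothetical helper (not FFmpeg code) that scans a JPEG byte stream for these markers to identify arithmetic-coded files; the marker byte values follow the ITU-T T.81 assignments:&lt;br /&gt;

```python
# Hypothetical helper (not FFmpeg code): scan a JPEG byte stream for the
# arithmetic-coding markers listed above. Marker values follow ITU-T T.81.
ARITH_MARKERS = {
    0xCC: "DAC",
    0xC9: "SOF9", 0xCA: "SOF10", 0xCB: "SOF11",
    0xCD: "SOF13", 0xCE: "SOF14", 0xCF: "SOF15",
}

def find_arith_markers(data):
    """Return names of arithmetic-coding markers found in data."""
    found = []
    for i in range(len(data) - 1):
        if data[i] == 0xFF and data[i + 1] in ARITH_MARKERS:
            found.append(ARITH_MARKERS[data[i + 1]])
    return found
```

A file where this finds any of these markers will currently be rejected by our decoder.&lt;br /&gt;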
&lt;br /&gt;
=== [[Enhanced Variable Rate Codec]] decoder ===&lt;br /&gt;
Samples and specs are available.&lt;br /&gt;
&lt;br /&gt;
=== adobe http F4F segmented dynamic streaming format ===&lt;br /&gt;
Sample streams on http://www.fox.com . Command line instructions for creating such files: http://help.adobe.com/en_US/HTTPStreaming/1.0/Using/WS9463dbe8dbe45c4c-c126f3b1260533756d-7ffc.html . The spec is available under Adobe NDA; it is not to be confused with the freely available F4V specification. Open-source PHP code to convert F4F to FLV: https://github.com/svnpenn/dotfiles/blob/master/etc/AdobeHDS.php&lt;br /&gt;
&lt;br /&gt;
=== get 3IV1 decoder working and benchmark ===&lt;br /&gt;
We have a decoder for 3IV1; it's currently #if 0'd in the MPEG-4 decoder. Your task is to test whether it still builds and works, and to fix it so that it does not slow down the MPEG-4 decoder when enabled. The end goal is to enable it by default.&lt;br /&gt;
&lt;br /&gt;
== Reverse Engineering Tasks ==&lt;br /&gt;
&lt;br /&gt;
=== Demuxer for csf format and video codec ===&lt;br /&gt;
This is partially analyzed in http://ffmpeg.org/trac/ffmpeg/ticket/1060&lt;br /&gt;
&lt;br /&gt;
=== reverse engineer realplayer's IVR format and create a demuxer for it ===&lt;br /&gt;
Samples are on the [[IVR]] page.&lt;br /&gt;
&lt;br /&gt;
=== pick a random binary codec from mplayer ===&lt;br /&gt;
[[MPlayer]] has over 100 binary codecs which have no open-source decoder. Pick one, find a sample and try to reverse engineer it. Note that some work has already been done on some codecs, so it's a good idea to ask on the mailing list before starting.&lt;br /&gt;
&lt;br /&gt;
=== emblaze demuxer/decoder from java code ===&lt;br /&gt;
samples and java decoder: http://samples.mplayerhq.hu/internets/emblaze/&lt;br /&gt;
&lt;br /&gt;
== Non-Programming Tasks ==&lt;br /&gt;
&lt;br /&gt;
=== Check Linux distributions for patches to ffmpeg ===&lt;br /&gt;
&lt;br /&gt;
Check various distros like Fedora, Ubuntu, Debian, Mint, Arch, Suse etc. for patches to ffmpeg. Write down the location of the patches so they can be checked on an annual basis. If patches are found, report them to the ffmpeg-devel mailing list or the bug trac.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== improve layout and accessibility of ffmpeg website ===&lt;br /&gt;
&lt;br /&gt;
Test ffmpeg.org with various browsers, including screen readers, and get it optimized and accessible to people with poor vision. Check wording and general ease of use; for example, consider putting up large download links for users like VLC and Firefox have.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== restore and update compatibility page on ffmpeg website ===&lt;br /&gt;
&lt;br /&gt;
We used to have a page that detailed how to create files for other software players and operating systems. Restore this page from git history and update it for new devices and standardized codecs (H.264 is the preferred codec now).&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== contact large ffmpeg users for broken / unplayable samples ===&lt;br /&gt;
Contact the largest users of ffmpeg, like YouTube, Facebook, archive.org, blip.tv and others, and ask them for access to samples that do not decode correctly.&lt;br /&gt;
&lt;br /&gt;
=== review sample request error messages ===&lt;br /&gt;
ffmpeg has a generic av_log_ask_for_sample log message to ask the user for a sample when there is a problem. Your task is to review ffmpeg decoders and demuxers (and possibly other inputs) and replace regular av_log messages requesting samples with it. Example commit here: http://ffmpeg.org/pipermail/ffmpeg-cvslog/2011-April/036509.html&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== write bluray and 3d howtos ===&lt;br /&gt;
Write a document or wiki article, or just link to some info, on how to play, encode and rip Blu-ray using ffmpeg/mplayer and the various Blu-ray libs required. Also write a guide on how to use ffmpeg/mplayer/vlc to encode and play various 3D formats, maybe including some supported hardware screens/video cards with examples. This includes updating the wiki pages [[Blu Ray and HD-DVD Playback Status]]&lt;br /&gt;
and [[How to make a 3d movie with ffmpeg]].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== write fate breakage notification bot ===&lt;br /&gt;
Script up a bot that scans http://fate.ffmpeg.org and tells #ffmpeg-devel when something breaks compilation in FATE.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== write ipad/iphone/ios howto ===&lt;br /&gt;
Write up some documentation on how to compile ffmpeg/ffplay/ffserver for iOS: exact tool versions, command lines, library requirements; both native compilation on the device and cross-compilation using OS X. An Android howto for various devices would be useful too.&lt;br /&gt;
&lt;br /&gt;
[[Category:FFmpeg]]&lt;br /&gt;
[[Category:Libav]]&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=FFmpeg_Summer_of_Code_2012&amp;diff=14020</id>
		<title>FFmpeg Summer of Code 2012</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=FFmpeg_Summer_of_Code_2012&amp;diff=14020"/>
		<updated>2012-03-27T19:02:52Z</updated>

		<summary type="html">&lt;p&gt;Reimar: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;FFmpeg Summer of Code Proposals and Qualification Tasks.&lt;br /&gt;
&lt;br /&gt;
==Timeline==&lt;br /&gt;
March 12th-16th: Project application evaluation.&lt;br /&gt;
&lt;br /&gt;
NOTE: FFmpeg has not been accepted this year, so these projects, the small tasks and the qualification tasks are only worth doing if you want to do them just for fun...&lt;br /&gt;
&lt;br /&gt;
==Contacting Developers==&lt;br /&gt;
Find us on irc, server: irc.freenode.net channel: #ffmpeg-devel&lt;br /&gt;
or contact us by subscribing to the mailing list https://lists.ffmpeg.org/mailman/listinfo/ffmpeg-devel/&lt;br /&gt;
&lt;br /&gt;
== Qualification Tasks ==&lt;br /&gt;
To be eligible for a Summer of Code project, we ask you to do a small programming task to prove you know the basics. FFmpeg is a large, complicated collection of code and it's not easy for beginners. There are some ideas for tasks on the [[Small FFmpeg Tasks]] page.&lt;br /&gt;
&lt;br /&gt;
== 1st Tier Project Proposals ==&lt;br /&gt;
These are proposals with a mentor attached.&lt;br /&gt;
&lt;br /&gt;
''Baptiste has also offered to mentor.''&lt;br /&gt;
&lt;br /&gt;
=== hwaccel: global architecture ===&lt;br /&gt;
&lt;br /&gt;
Revisit my older v2 proposal to completely get rid of HW pixel formats. The advantages were: it simplifies things, cleans up code, allows moving more code into the codec library (FFmpeg), and allows falling back to SW decoding (at the stream level if HW cannot meet the requirements, not while decoding a particular stream). The key point was to separate identification of the pixel format from the underlying HW accelerator. This would also allow reading HW surfaces back, when the selected pixel format requests it.&lt;br /&gt;
&lt;br /&gt;
Add encoding and post processing infrastructure.&lt;br /&gt;
&lt;br /&gt;
* VA-API (hwaccel)&lt;br /&gt;
&lt;br /&gt;
:Codecs related:&lt;br /&gt;
 + Add support for {,M}JPEG decoding&lt;br /&gt;
 + Add support for VC-1 interlaced acceleration&lt;br /&gt;
 + Add support for H.264 interlaced&lt;br /&gt;
&lt;br /&gt;
:Tools related:&lt;br /&gt;
:Add VA support to ffplay (see my older patches)&lt;br /&gt;
  + requires some VO infrastructure work&lt;br /&gt;
  + requires hwaccel hooks&lt;br /&gt;
  + requires enabling vaapi, dxva, bcm, tidsp, whatever&lt;br /&gt;
&lt;br /&gt;
* Add VA support to ffmpeg&lt;br /&gt;
  + requires VA/DRI API, for no X dependency&lt;br /&gt;
  + dependencies: hwaccel v2 to construct pipelines (HW-&amp;gt;HW, HW-&amp;gt;SW, SW-&amp;gt;HW, etc.), or at least allow for VA surface readback wherever necessary.&lt;br /&gt;
&lt;br /&gt;
* VDPAU (hwaccel)&lt;br /&gt;
&lt;br /&gt;
Update and push my older hwaccel-based VDPAU code. It should still work. :)&lt;br /&gt;
:Priority: hwaccel v2 infrastructure work first, ffplay enabling second.&lt;br /&gt;
&lt;br /&gt;
''Mentor: Gwenole Beauchesne''&lt;br /&gt;
&lt;br /&gt;
=== glplay ===&lt;br /&gt;
Add OpenGL output to ffplay; this should allow for better performance (and fewer bugs, at least for some hardware/driver combinations). This could be a new application (glplay), but it is probably simpler to extend ffplay to use OpenGL. You can use code from MPlayer's OpenGL vo module, which you may relicense under the LGPL.&lt;br /&gt;
&lt;br /&gt;
''Mentor: Reimar Döffinger''&lt;br /&gt;
&lt;br /&gt;
=== Improve the audio resampling/rematrixing/converting code ===&lt;br /&gt;
&lt;br /&gt;
* Right now, we're using libswresample to resample/rematrix audio (samplerate / channels) and to convert the audio sample format (int, float, 16-bit, 32-bit).&lt;br /&gt;
* Both interleaved and planar audio sample formats are already supported&lt;br /&gt;
* We need SIMD optimization of popular conversions (float-int16, int16-float), (stereo-mono-5.1) and anything else that's frequently used.&lt;br /&gt;
* We need support for alternate conversion functions (e.g. sample format conversion with or without dithering)&lt;br /&gt;
* Fix bugs in current design (none known but there sure are some)&lt;br /&gt;
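To illustrate the kind of conversion function meant here, a plain-Python sketch of a clipping float-to-int16 sample conversion; this is only the scalar reference a SIMD version would optimize, and the actual libswresample implementation (in C/asm) differs:&lt;br /&gt;

```python
# Scalar reference sketch of a float -> int16 sample-format conversion
# with clipping. Illustrative only; libswresample does this in C/asm.
INT16_MAX = 32767
INT16_MIN = -32768

def float_to_int16(samples):
    """Convert float samples in [-1.0, 1.0] to clipped int16 values."""
    out = []
    for s in samples:
        v = int(round(s * 32768.0))
        out.append(max(INT16_MIN, min(INT16_MAX, v)))
    return out
```

A dithering variant of the same function would add shaped noise before the rounding step.&lt;br /&gt;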
&lt;br /&gt;
''Mentor: [[User:Michael|Michael Niedermayer]]''&lt;br /&gt;
&lt;br /&gt;
=== Implement a H.265 / High efficiency video coding (HEVC) decoder ===&lt;br /&gt;
&lt;br /&gt;
* Write a basic decoder supporting I, P, and, if time permits, B slices.&lt;br /&gt;
* It does not need to be ASM/SIMD optimized but its high level structure must permit such optimizations to be easily added later.&lt;br /&gt;
* As a qualification task you need to implement parsing headers and maybe a bit beyond that to demonstrate that you are qualified and understand the HEVC specification. This project requires a solid understanding of video coding and C, it's not something for the average SOC student.&lt;br /&gt;
* A draft spec is available at: http://phenix.it-sudparis.eu/jct/doc_end_user/documents/8_San%20Jose/wg11/JCTVC-H1003-v21.zip&lt;br /&gt;
&lt;br /&gt;
''Mentor: [[User:Michael|Michael Niedermayer]]''&lt;br /&gt;
&lt;br /&gt;
=== H.264 MVC ===&lt;br /&gt;
&lt;br /&gt;
* Add MVC support to our H.264 decoder. MVC is used in 3D Blu-Rays. &lt;br /&gt;
* As qualification you have to do some work that demonstrates your understanding of MVC and that is a subpart of the whole MVC implementation.&lt;br /&gt;
&lt;br /&gt;
''Mentor: [[User:Michael|Michael Niedermayer]]''&lt;br /&gt;
&lt;br /&gt;
=== Libavfilter extension ===&lt;br /&gt;
&lt;br /&gt;
Libavfilter is the FFmpeg filtering library that started as a 2007 SoC [[FFmpeg Summer Of Code#Video Filter API (AKA libavfilter)|project]]. It currently supports audio and video filtering and generation.&lt;br /&gt;
&lt;br /&gt;
The task would consist of writing or porting audio and video filters, and possibly fixing/extending the libavfilter API and design.&lt;br /&gt;
&lt;br /&gt;
In particular the work may focus on porting MPlayer filters which are currently integrated through the mp wrapper.&lt;br /&gt;
For each port the student should verify that the new filter produces the same output (by comparing the output generated by -vf mp=FILTER and -vf FILTER) and check that the new integrated filter is not slower.&lt;br /&gt;
&lt;br /&gt;
Prerequisites: good C coding skills, familiarity with git/source code control systems, having some background on DSP and image/sound processing techniques would be a bonus but is not strictly required.&lt;br /&gt;
&lt;br /&gt;
For getting more ideas read also the [[FFmpeg_/_Libav_Summer_Of_Code_2011#Libavfilter_video_work|GSoC 2011 libavfilter video proposal]] and [https://ffmpeg.org/trac/ffmpeg/query?status=new&amp;amp;status=open&amp;amp;status=reopened&amp;amp;component=avfilter&amp;amp;col=id&amp;amp;col=summary&amp;amp;col=status&amp;amp;col=type&amp;amp;col=priority&amp;amp;col=component&amp;amp;col=version&amp;amp;order=priority trac libavfilter tickets].&lt;br /&gt;
&lt;br /&gt;
Qualification task: a port or a new implementation of one or more filters.&lt;br /&gt;
&lt;br /&gt;
''Mentor: Stefano Sabatini - saste on IRC (possibly with a backup mentor)''.&lt;br /&gt;
&lt;br /&gt;
=== Bayer colorspace formats ===&lt;br /&gt;
&lt;br /&gt;
Several image and video formats store pixels using Bayer-pattern colorspaces. Supporting these formats would broaden FFmpeg's applicability to RAW still and video photography processing. Tasks:&lt;br /&gt;
* Implement bayer transformations in libswscale (plain C)&lt;br /&gt;
* Add bayer formats to the libavutil pixfmt enumeration routines&lt;br /&gt;
* Extend TIFF decoder to support DNG-Bayer format&lt;br /&gt;
* Complete the PhotoCINE demuxer to support Bayer format (or another format of your choosing)&lt;br /&gt;
* SIMD optimizations of the libswscale transformations&lt;br /&gt;
* Decoders/specs may be available in the [[Dcraw]] project&lt;br /&gt;
&lt;br /&gt;
Qualification task: TBD&lt;br /&gt;
&lt;br /&gt;
''Mentor: [[User:Suxen_drol|Peter Ross]]''&lt;br /&gt;
&lt;br /&gt;
=== Extend image formats support ===&lt;br /&gt;
&lt;br /&gt;
Improve FFmpeg support for image formats: add missing formats (e.g. XPM) and extend support for the current ones (e.g. animated GIF, GIF compression, fixing PNG todos, adding support for animated PNG) etc.&lt;br /&gt;
&lt;br /&gt;
Qualification task: TBD (possibly finally fixing and integrating Måns' zlib decoder that has remained unmerged for ages? Or just starting with some small part of the task itself, or implementing format autodetection for the imagepipe demuxer)&lt;br /&gt;
&lt;br /&gt;
''Mentor: [[User:reimar|Reimar Döffinger]]''&lt;br /&gt;
&lt;br /&gt;
=== DTS-HD decoder ===&lt;br /&gt;
&lt;br /&gt;
* ETSI released specifications (http://www.etsi.org/deliver/etsi_ts/102100_102199/102114/01.03.01_60/ts_102114v010301p.pdf). Your job is to complete the existing decoder with the following features.&lt;br /&gt;
&lt;br /&gt;
 (1) Add support for mixed Core + DTS-HD stream structure&lt;br /&gt;
     (DtsCoreFrame+DtsHdFrame+DtsCoreFrame+DtsHdFrame+...), used by Blu-Ray main&lt;br /&gt;
     and commentary tracks.&lt;br /&gt;
 (2) Add support for XXCh extension (6.1 and 7.1 channels).&lt;br /&gt;
 (3) Add support for X96 extension (96 kHz).&lt;br /&gt;
 (4) Add support for XLL extension (lossless).&lt;br /&gt;
 (5) Add support for a pure DTS-HD stream structure&lt;br /&gt;
     (DtsHdFrame+DtsHdFrame+DtsHdFrame+...), used by Blu-Ray PiP tracks.&lt;br /&gt;
 (6) Add support for XBR extension (extra bitrate).&lt;br /&gt;
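To make the mixed stream structure in (1) concrete, a rough sketch (not the FFmpeg decoder) that locates core and extension-substream frames by their sync words; the sync values are the ones specified for DTS core (0x7FFE8001) and the DTS-HD extension substream (0x64582025):&lt;br /&gt;

```python
# Sketch (not the FFmpeg decoder): locate core and extension substream
# frames in a mixed DTS / DTS-HD stream by their sync words. A real
# parser reads each frame size from the header instead of scanning.
CORE_SYNC = bytes.fromhex("7ffe8001")  # DTS core frame sync word
HD_SYNC = bytes.fromhex("64582025")    # DTS-HD extension substream sync

def find_sync_points(buf):
    """Return (kind, offset) pairs for every sync word, in stream order."""
    points = []
    for sync, kind in ((CORE_SYNC, "core"), (HD_SYNC, "hd")):
        pos = buf.find(sync)
        while pos != -1:
            points.append((kind, pos))
            pos = buf.find(sync, pos + 1)
    return sorted(points, key=lambda p: p[1])
```

In the Blu-ray main/commentary layout this would report alternating core and hd offsets; in a pure DTS-HD stream (5) only hd offsets appear.&lt;br /&gt;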
&lt;br /&gt;
''Mentor: Benjamin Larsson''&lt;br /&gt;
&lt;br /&gt;
=== Libavdevice API ===&lt;br /&gt;
Currently libavdevice uses the libavformat API. This is far from useful for real-time video rendering. Your task is to write a new API which resolves this and similar issues, and to port all currently present input and output devices to the new API.&lt;br /&gt;
&lt;br /&gt;
The new API should have:&lt;br /&gt;
 * Support for output video devices which do not provide their own scaler.&lt;br /&gt;
 * Minimal latency.&lt;br /&gt;
&lt;br /&gt;
Qualification task: write one new video outdev using the current libavdevice API, for example any of, but not limited to: aalib, caca, x11, xv, xover, vesa or gl2.&lt;br /&gt;
&lt;br /&gt;
''Mentor: [[User:Pbm|Paul B Mahol]]''&lt;br /&gt;
&lt;br /&gt;
=== Redesigning the protocol reads ===&lt;br /&gt;
Eliminate active polling while still allowing interrupts, and allow selecting on multiple handles.&lt;br /&gt;
&lt;br /&gt;
''Mentor: Nicolas George''&lt;br /&gt;
&lt;br /&gt;
== 2nd Tier (need mentor) Project Proposals ==&lt;br /&gt;
Some of the following proposals are also proposed by other organizations, we will try to coordinate this with them so as to avoid duplicate work.&lt;br /&gt;
We are also happy to hear your personal project ideas ...&lt;br /&gt;
&lt;br /&gt;
=== AAC decoder improvements ===&lt;br /&gt;
Our AAC decoder does not support low-delay. Part of this task will be to also finish last year's BSAC task. A possible qualification task is to fix a crash in the current BSAC code with one of the samples from the BSAC testing suite.&lt;br /&gt;
:Sample: https://ffmpeg.org/trac/ffmpeg/ticket/113&lt;br /&gt;
&lt;br /&gt;
=== AAC encoder improvements ===&lt;br /&gt;
Our AAC encoder does not produce competitive quality per bitrate.&lt;br /&gt;
Improve the encoder to be better than some other commonly used encoder like libfaac.&lt;br /&gt;
This requires a solid understanding of things like psychoacoustics and rate distortion.&lt;br /&gt;
A qualification task for this could be to improve the encoder by at least 5% in bitrate at the same quality,&lt;br /&gt;
as measured by some objective metric.&lt;br /&gt;
&lt;br /&gt;
=== FF Fuzzer ===&lt;br /&gt;
Write a system like FATE that fuzzes multimedia files and tests these fuzzed files under AddressSanitizer and Valgrind with ffmpeg and ffplay.&lt;br /&gt;
When a crash or other anomaly is found, it would use git bisect to identify the exact commit that introduced the bug,&lt;br /&gt;
and either display this via some web frontend (similar to fate.ffmpeg.org) or just automatically send an email to some&lt;br /&gt;
dedicated mailing list. The system has to be robust (there will be infinite loops, OOM conditions and randomly occurring&lt;br /&gt;
crashes). It's also important that the system is easy to maintain and can filter out duplicates of the same issue.&lt;br /&gt;
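A minimal sketch of two building blocks such a system would need (the harness actually running ffmpeg/ffplay under the sanitizers and driving git bisect is assumed and omitted; all names here are hypothetical):&lt;br /&gt;

```python
import hashlib
import random

# Hypothetical building blocks for the fuzzer: a seeded byte mutator,
# so failing inputs are exactly reproducible, and an issue fingerprint
# used to filter out duplicate reports of the same crash.

def mutate(sample, seed, flips=8):
    """Flip a fixed number of pseudo-random bits in a copy of sample."""
    rng = random.Random(seed)
    data = bytearray(sample)
    for _ in range(flips):
        data[rng.randrange(len(data))] ^= 2 ** rng.randrange(8)
    return bytes(data)

def issue_fingerprint(tool_output):
    """Hash sanitizer/valgrind output so one crash is reported once."""
    return hashlib.sha256(tool_output.encode()).hexdigest()[:16]
```

Seeding the mutator is what makes bisecting practical: the same (sample, seed) pair reproduces the same broken file on every commit being tested.&lt;br /&gt;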
&lt;br /&gt;
=== VC1 interlaced ===&lt;br /&gt;
FFmpeg has code for interlaced VC1, but nearly all samples still do not decode correctly. The task is to finish last year's project. You should be able to find a possible qualification task by testing interlaced samples.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== MPEG-4 ALS Roundup ===&lt;br /&gt;
&lt;br /&gt;
This task is to update and enhance the existing ALS decoder as well as integrate&lt;br /&gt;
and enhance the rudimentary encoder found at:&lt;br /&gt;
https://github.com/justinruggles/FFmpeg-alsenc&lt;br /&gt;
&lt;br /&gt;
Possible features are:&lt;br /&gt;
&lt;br /&gt;
* implement rls-lms in the decoder&lt;br /&gt;
* do correct channel layout/sort handling in the decoder&lt;br /&gt;
* update to current master&lt;br /&gt;
* use codec private options&lt;br /&gt;
* implement encode2(), setting pts and duration&lt;br /&gt;
* document options and examples in encoders.texi&lt;br /&gt;
* come up with a good set of encoding tests for FATE&lt;br /&gt;
* implement mcc/channel sort in the encoder&lt;br /&gt;
* implement rls-lms in the encoder&lt;br /&gt;
* implement float support&lt;br /&gt;
&lt;br /&gt;
=== Fix and improve FFserver ===&lt;br /&gt;
FFserver has been part of FFmpeg for a long time, but due to the lack of a motivated maintainer it's a bit buggy.&lt;br /&gt;
For this project you would have to debug and fix many bugs. It requires good skills at reading and understanding other people's code.&lt;br /&gt;
As a qualification task you will have to write functioning regression tests for FFserver, which implies some bugfixing to make ffserver produce the same output on all supported platforms.&lt;br /&gt;
&lt;br /&gt;
=== Reverse engineer [[TAK]] codec and write decoder for it ===&lt;br /&gt;
The TAK format is already partially documented.&lt;br /&gt;
You need to revisit that documentation and update and/or fix any missing or wrong information,&lt;br /&gt;
then reverse engineer the codec and write a working bit-exact decoder.&lt;br /&gt;
&lt;br /&gt;
=== Support for more subtitle formats ===&lt;br /&gt;
We have libass support now; what is needed is either a parser (from mplayer) to convert subtitles into ASS, or something else.&lt;br /&gt;
* http://mailman.videolan.org/pipermail/vlc-devel/2011-September/081803.html&lt;br /&gt;
* http://ale5000.altervista.org/subtitles.htm&lt;br /&gt;
&lt;br /&gt;
=== MKV ordered chapters / playlist support ===&lt;br /&gt;
Get playlist support into ffmpeg. Playlists are blocking a few things like QuickTime edit lists and .asx / .pls files.&lt;br /&gt;
&lt;br /&gt;
=== Adobe fragmented http in/out ===&lt;br /&gt;
Adobe has a new streaming format.&lt;br /&gt;
* info/spec: http://www.adobe.com/products/httpdynamicstreaming/&lt;br /&gt;
&lt;br /&gt;
=== libavfilter 9/10bit support ===&lt;br /&gt;
Make filters work with higher bit-depth codecs/colorspaces.&lt;br /&gt;
&lt;br /&gt;
=== Fix copying video between formats ===&lt;br /&gt;
Lots of H.264 streams are stored in FLV, MP4, MKV and MPEG-TS. People wish to remux these into various formats so they work on their hardware (PS3, iPod etc.).&lt;br /&gt;
 https://ffmpeg.org/trac/ffmpeg/ticket/796&lt;br /&gt;
 https://ffmpeg.org/trac/ffmpeg/ticket/822&lt;br /&gt;
 https://ffmpeg.org/trac/ffmpeg/ticket/954&lt;br /&gt;
 https://ffmpeg.org/trac/ffmpeg/ticket/976&lt;br /&gt;
&lt;br /&gt;
=== Extend paletted format support ===&lt;br /&gt;
&lt;br /&gt;
Clean up the framework to better handle paletted formats, write a posterize filter, add palette output support to libswscale (possibly making use of libavcodec/elbg), and add support for reading and saving a palette to a file and applying it to the input video (e.g. by creating ad-hoc filters).&lt;br /&gt;
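As an illustration of the posterize idea (assumed semantics: quantize each 8-bit channel to a fixed number of evenly spaced levels; the real filter would operate on whole frames in C):&lt;br /&gt;

```python
# Illustrative per-channel posterize sketch: map an 8-bit value onto
# a fixed number of evenly spaced levels. The actual libavfilter
# implementation would process whole AVFrames in C instead.
def posterize_channel(value, levels):
    """Quantize one 8-bit channel value to the given number of levels."""
    step = 255 // (levels - 1)
    return min(255, round(value / step) * step)
```

Applied to every pixel of every channel this produces the banded look posterize filters are known for; fewer levels means coarser bands.&lt;br /&gt;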
&lt;br /&gt;
=== Port formats/colorspace support from dcraw or make dcraw wrapper ===&lt;br /&gt;
[[Dcraw]] supports many raw camera formats that ffmpeg may not. Port or make a wrapper for this project.&lt;br /&gt;
&lt;br /&gt;
[[Category:FFmpeg]]&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=CSF&amp;diff=13993</id>
		<title>CSF</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=CSF&amp;diff=13993"/>
		<updated>2012-03-14T16:44:36Z</updated>

		<summary type="html">&lt;p&gt;Reimar: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Currently discussed, with a sample, here: https://ffmpeg.org/trac/ffmpeg/ticket/1060&lt;br /&gt;
Copy-and-pasted description:&lt;br /&gt;
It is a format based on PNG and partial re-encoding, as you can see when you look at it with e.g.  http://sourceforge.net/projects/extractor-gtk/&lt;br /&gt;
Possibly contains some other encoding types, too.&lt;br /&gt;
All container structures seem to be little-endian.&lt;br /&gt;
Each &amp;quot;frame&amp;quot; contains a 4-byte size at offset 8.&lt;br /&gt;
For the PNG-encoded ones, offsets 16-24 seem to contain four 16-bit values that indicate the offset and size of the image region the PNG updates. After that seems to follow yet another size field containing the remaining bytes.&lt;br /&gt;
The start of the following packet is always aligned to 16 bytes.&lt;br /&gt;
There seems to be some kind of index at 0x190.&lt;br /&gt;
This and the data start seem to be referenced in the header, at bytes 24 and 28.&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=FFmpeg_Summer_of_Code_2012&amp;diff=13950</id>
		<title>FFmpeg Summer of Code 2012</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=FFmpeg_Summer_of_Code_2012&amp;diff=13950"/>
		<updated>2012-03-11T08:52:26Z</updated>

		<summary type="html">&lt;p&gt;Reimar: /* extend image formats support */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;:Please detail the deadlines for applying and being a student/mentor.&lt;br /&gt;
&lt;br /&gt;
FFmpeg Summer of Code Proposals and Qualification Tasks.&lt;br /&gt;
&lt;br /&gt;
==Timeline==&lt;br /&gt;
March 12th-16th: Project application evaluation.&lt;br /&gt;
&lt;br /&gt;
=Contacting Developers=&lt;br /&gt;
Find us on irc, server: irc.freenode.net channel: #ffmpeg-devel&lt;br /&gt;
or contact us by subscribing to the mailing list https://lists.ffmpeg.org/mailman/listinfo/ffmpeg-devel/&lt;br /&gt;
&lt;br /&gt;
== Qualification Tasks ==&lt;br /&gt;
To be eligible for a Summer of Code project, we ask you to do a small programming task to prove you know the basics. FFmpeg is a large, complicated collection of code and it's not easy for beginners. There are some ideas for tasks on the [[Small FFmpeg Tasks]] page.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 1st Tier Project Proposals ==&lt;br /&gt;
These are proposals with a mentor attached.&lt;br /&gt;
&lt;br /&gt;
Baptiste has also offered to mentor.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Improve the audio resampling/rematrixing/converting code ===&lt;br /&gt;
&lt;br /&gt;
* right now, we're using libswresample to resample/rematrix audio (samplerate / channels) and to resample the audio format (int, float, 16-bit, 32-bit).&lt;br /&gt;
* both interleaved and planar audio sample formats are already supported&lt;br /&gt;
* We need SIMD optimization of popular conversions (float-int16, int16-float), (stereo-mono-5.1) and anything else that's frequently used.&lt;br /&gt;
* We need support for alternate conversion functions (e.g. sample format conversion with or without dithering)&lt;br /&gt;
* fix bugs in current design (none known but there sure are some)&lt;br /&gt;
&lt;br /&gt;
''Mentor: [[User:Michael|Michael Niedermayer]]''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Libavfilter extension ===&lt;br /&gt;
&lt;br /&gt;
Libavfilter is the FFmpeg filtering library that started as a 2007 SoC [[FFmpeg Summer Of Code#Video Filter API (AKA libavfilter)|project]]. It currently supports audio and video filtering and generation.&lt;br /&gt;
&lt;br /&gt;
The task would consist of writing or porting audio and video filters, and possibly fixing/extending the libavfilter API and design.&lt;br /&gt;
&lt;br /&gt;
In particular the work may focus on porting MPlayer filters which are currently integrated through the mp wrapper.&lt;br /&gt;
For each port the student should verify that the new filter produces the same output (by comparing the output generated by -vf mp=FILTER and -vf FILTER) and check that the new integrated filter is not slower.&lt;br /&gt;
&lt;br /&gt;
Prerequisites: good C coding skills, familiarity with git/source code control systems, having some background on DSP and image/sound processing techniques would be a bonus but is not strictly required.&lt;br /&gt;
&lt;br /&gt;
For getting more ideas read also the [[FFmpeg_/_Libav_Summer_Of_Code_2011#Libavfilter_video_work|GSoC 2011 libavfilter video proposal]] and [https://ffmpeg.org/trac/ffmpeg/query?status=new&amp;amp;status=open&amp;amp;status=reopened&amp;amp;component=avfilter&amp;amp;col=id&amp;amp;col=summary&amp;amp;col=status&amp;amp;col=type&amp;amp;col=priority&amp;amp;col=component&amp;amp;col=version&amp;amp;order=priority trac libavfilter tickets].&lt;br /&gt;
&lt;br /&gt;
Qualification task: a port or a new implementation of one or more filters.&lt;br /&gt;
&lt;br /&gt;
''Mentor: Stefano Sabatini - saste on IRC (possibly with a backup mentor)''.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Bayer colorspace formats ===&lt;br /&gt;
&lt;br /&gt;
Several image and video formats store pixels using Bayer-pattern colorspaces. Supporting these formats would broaden FFmpeg's applicability to RAW still and video photography processing. Tasks:&lt;br /&gt;
* Implement bayer transformations in libswscale (plain C)&lt;br /&gt;
* Add bayer formats to the libavutil pixfmt enumeration routines&lt;br /&gt;
* Extend TIFF decoder to support DNG-Bayer format&lt;br /&gt;
* Complete the PhotoCINE demuxer to support Bayer format (or another format of your choosing)&lt;br /&gt;
* SIMD optimisations of the libswscale transformations&lt;br /&gt;
&lt;br /&gt;
Qualification task: TBD&lt;br /&gt;
&lt;br /&gt;
''Mentor: [[User:Suxen_drol|Peter Ross]]''&lt;br /&gt;
&lt;br /&gt;
=== extend image formats support ===&lt;br /&gt;
&lt;br /&gt;
Improve FFmpeg support for image formats: add missing formats (e.g. XPM) and extend support for the current ones (e.g. animated GIF, GIF compression, fixing PNG todos, adding support for animated PNG) etc.&lt;br /&gt;
&lt;br /&gt;
Qualification task: TBD (possibly finally fixing and integrating Måns' zlib decoder that has remained unmerged for ages? Or just starting with some small part of the task itself, or implementing format autodetection for the imagepipe demuxer)&lt;br /&gt;
&lt;br /&gt;
''Mentor: [[User:reimar|Reimar Döffinger]]''&lt;br /&gt;
&lt;br /&gt;
== 2nd Tier (need mentor) Project Proposals ==&lt;br /&gt;
Some of the following proposals might be done by Libav; we are still waiting for the dust to settle.&lt;br /&gt;
&lt;br /&gt;
=== H.264 MVC ===&lt;br /&gt;
* Implement an H.264 MVC decoder. This format is used in 3D Blu-Rays.&lt;br /&gt;
&lt;br /&gt;
=== DTS-HD decoder ===&lt;br /&gt;
&lt;br /&gt;
* ETSI released specifications (http://www.etsi.org/deliver/etsi_ts/102100_102199/102114/01.03.01_60/ts_102114v010301p.pdf). Your job is to complete the existing decoder with the following features.&lt;br /&gt;
&lt;br /&gt;
 (1) Add support for mixed Core + DTS-HD stream structure&lt;br /&gt;
     (DtsCoreFrame+DtsHdFrame+DtsCoreFrame+DtsHdFrame+...), used by Blu-Ray main&lt;br /&gt;
     and commentary tracks.&lt;br /&gt;
 (2) Add support for XXCh extension (6.1 and 7.1 channels).&lt;br /&gt;
 (3) Add support for X96 extension (96 kHz).&lt;br /&gt;
 (4) Add support for XLL extension (lossless).&lt;br /&gt;
 (5) Add support for a pure DTS-HD stream structure&lt;br /&gt;
     (DtsHdFrame+DtsHdFrame+DtsHdFrame+...), used by Blu-Ray PiP tracks.&lt;br /&gt;
 (6) Add support for XBR extension (extra bitrate).&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== MPEG-4 ALS Roundup ===&lt;br /&gt;
&lt;br /&gt;
This task is to update and enhance the existing ALS decoder as well as integrate&lt;br /&gt;
and enhance the rudimentary encoder found at:&lt;br /&gt;
https://github.com/justinruggles/FFmpeg-alsenc&lt;br /&gt;
&lt;br /&gt;
Possible features are:&lt;br /&gt;
&lt;br /&gt;
* implement rls-lms in the decoder&lt;br /&gt;
* do correct channel layout/sort handling in the decoder&lt;br /&gt;
* update to current master&lt;br /&gt;
* use codec private options&lt;br /&gt;
* implement encode2(), setting pts and duration&lt;br /&gt;
* document options and examples in encoders.texi&lt;br /&gt;
* come up with a good set of encoding tests for FATE&lt;br /&gt;
* implement mcc/channel sort in the encoder&lt;br /&gt;
* implement rls-lms in the encoder&lt;br /&gt;
* implement float support&lt;br /&gt;
&lt;br /&gt;
=== reverse engineer [[TAK]] format ===&lt;br /&gt;
There is some interest in this format. Maybe a qualification task?&lt;br /&gt;
&lt;br /&gt;
=== support for more subtitle formats ===&lt;br /&gt;
We have libass support now; what is needed is either a parser (from MPlayer) to convert subtitles into ASS, or some other approach.&lt;br /&gt;
* http://mailman.videolan.org/pipermail/vlc-devel/2011-September/081803.html&lt;br /&gt;
* http://ale5000.altervista.org/subtitles.htm&lt;br /&gt;
&lt;br /&gt;
=== mkv ordered chapters / playlist support ===&lt;br /&gt;
Get playlist support into FFmpeg. Missing playlist support is blocking a few things, such as QuickTime edit lists and .asx/.pls files.&lt;br /&gt;
&lt;br /&gt;
=== adobe fragmented http in/out ===&lt;br /&gt;
Adobe has a new streaming format.&lt;br /&gt;
* info/spec: http://www.adobe.com/products/httpdynamicstreaming/&lt;br /&gt;
&lt;br /&gt;
=== libavfilter 9/10bit support ===&lt;br /&gt;
Make filters work with higher bit-depth codecs/colorspaces.&lt;br /&gt;
&lt;br /&gt;
=== fix copying video between formats ===&lt;br /&gt;
Lots of H.264 streams exist in FLV, MP4, MKV and MPEG-TS containers. People wish to remux these into various formats so they work on their hardware (PS3, iPod, etc.).&lt;br /&gt;
 https://ffmpeg.org/trac/ffmpeg/ticket/796&lt;br /&gt;
 https://ffmpeg.org/trac/ffmpeg/ticket/822&lt;br /&gt;
 https://ffmpeg.org/trac/ffmpeg/ticket/954&lt;br /&gt;
 https://ffmpeg.org/trac/ffmpeg/ticket/976&lt;br /&gt;
&lt;br /&gt;
=== extend paletted format support ===&lt;br /&gt;
&lt;br /&gt;
Clean up the framework for better handling of paletted formats, write a posterize filter, add palette output support to libswscale (possibly making use of libavcodec/elbg), and add support for reading a palette from and saving it to a file and applying it to the input video (e.g. by creating ad-hoc filters).&lt;br /&gt;
&lt;br /&gt;
=== g2m4 decoder ===&lt;br /&gt;
Lots of VideoLAN users are requesting support for this codec. There is a binary decoder in MPlayer. It is possible that Maxim is already working on this.&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=FFmpeg_Summer_of_Code_2012&amp;diff=13949</id>
		<title>FFmpeg Summer of Code 2012</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=FFmpeg_Summer_of_Code_2012&amp;diff=13949"/>
		<updated>2012-03-11T08:51:35Z</updated>

		<summary type="html">&lt;p&gt;Reimar: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;:Please detail the deadlines for applying and being a student/mentor.&lt;br /&gt;
&lt;br /&gt;
FFmpeg Summer of Code Proposals and Qualification Tasks.&lt;br /&gt;
&lt;br /&gt;
==Timeline==&lt;br /&gt;
March 12th-16th: Project application evaluation.&lt;br /&gt;
&lt;br /&gt;
=Contacting Developers=&lt;br /&gt;
Find us on IRC: server irc.freenode.net, channel #ffmpeg-devel,&lt;br /&gt;
or contact us by subscribing to the mailing list https://lists.ffmpeg.org/mailman/listinfo/ffmpeg-devel/&lt;br /&gt;
&lt;br /&gt;
== Qualification Tasks ==&lt;br /&gt;
To be eligible for a Summer of Code project, we ask you to do a small programming task to prove you know the basics. FFmpeg is a large, complicated collection of code and it's not easy for beginners. There are some ideas for tasks on the [[Small FFmpeg Tasks]] page.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 1st Tier Project Proposals ==&lt;br /&gt;
These are proposals with a mentor attached.&lt;br /&gt;
&lt;br /&gt;
Baptiste has also offered to mentor.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Improve the audio resampling/rematrixing/converting code ===&lt;br /&gt;
&lt;br /&gt;
* right now, we're using libswresample to resample/rematrix audio (sample rate / channels) and to convert the sample format (int, float, 16-bit, 32-bit).&lt;br /&gt;
* both interleaved and planar audio sample formats are already supported&lt;br /&gt;
* We need SIMD optimization of popular conversions (float-int16, int16-float), (stereo-mono-5.1) and anything else that's frequently used.&lt;br /&gt;
* We need support for alternate conversion functions (e.g. sample format conversion with or without dithering)&lt;br /&gt;
* fix bugs in the current design (none known, but there surely are some)&lt;br /&gt;
&lt;br /&gt;
''Mentor: [[User:Michael|Michael Niedermayer]]''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Libavfilter extension ===&lt;br /&gt;
&lt;br /&gt;
Libavfilter is the FFmpeg filtering library that started as a 2007 SoC [[FFmpeg Summer Of Code#Video Filter API (AKA libavfilter)|project]]. It currently supports audio and video filtering and generation.&lt;br /&gt;
&lt;br /&gt;
The task consists of writing or porting audio and video filters, and eventually fixing/extending the libavfilter API and design.&lt;br /&gt;
&lt;br /&gt;
In particular the work may focus on porting MPlayer filters which are currently integrated through the mp wrapper.&lt;br /&gt;
For each port the student should verify that the new filter produces the same output (by comparing the output generated by -vf mp=FILTER and -vf FILTER) and check that the new integrated filter is not slower.&lt;br /&gt;
&lt;br /&gt;
Prerequisites: good C coding skills, familiarity with git/source code control systems, having some background on DSP and image/sound processing techniques would be a bonus but is not strictly required.&lt;br /&gt;
&lt;br /&gt;
For more ideas, also read the [[FFmpeg_/_Libav_Summer_Of_Code_2011#Libavfilter_video_work|GSoC 2011 libavfilter video proposal]] and the [https://ffmpeg.org/trac/ffmpeg/query?status=new&amp;amp;status=open&amp;amp;status=reopened&amp;amp;component=avfilter&amp;amp;col=id&amp;amp;col=summary&amp;amp;col=status&amp;amp;col=type&amp;amp;col=priority&amp;amp;col=component&amp;amp;col=version&amp;amp;order=priority trac libavfilter tickets].&lt;br /&gt;
&lt;br /&gt;
Qualification task: a port or a new implementation of one or more filters.&lt;br /&gt;
&lt;br /&gt;
''Mentor: Stefano Sabatini - saste on IRC (possibly with a backup mentor)''.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Bayer colorspace formats ===&lt;br /&gt;
&lt;br /&gt;
Several image and video formats store pixels using Bayer-pattern colorspaces. Supporting these formats would broaden FFmpeg's applicability to RAW still and video photography processing. Tasks:&lt;br /&gt;
* Implement bayer transformations in libswscale (plain C)&lt;br /&gt;
* Add bayer formats to the libavutil pixfmt enumeration routines&lt;br /&gt;
* Extend TIFF decoder to support DNG-Bayer format&lt;br /&gt;
* Complete the PhotoCINE demuxer to support the Bayer format (or another format of your choosing)&lt;br /&gt;
* SIMD optimisations of the libswscale transformations&lt;br /&gt;
&lt;br /&gt;
Qualification task: TBD&lt;br /&gt;
&lt;br /&gt;
''Mentor: [[User:Suxen_drol|Peter Ross]]''&lt;br /&gt;
&lt;br /&gt;
=== extend image formats support ===&lt;br /&gt;
&lt;br /&gt;
Improve FFmpeg's support for image formats: add missing formats (e.g. XPM) and extend support for the current ones (e.g. animated GIF, GIF compression, fixing PNG todos, adding support for animated PNG), etc.&lt;br /&gt;
&lt;br /&gt;
Qualification task: TBD (possibly finally fixing and integrating Måns' zlib decoder, which has remained unmerged for ages? Or just starting with some small part of the task itself)&lt;br /&gt;
&lt;br /&gt;
''Mentor: [[User:reimar|Reimar Döffinger]]''&lt;br /&gt;
&lt;br /&gt;
== 2nd Tier (need mentor) Project Proposals ==&lt;br /&gt;
Some of the following proposals might be done by Libav; we are still waiting for the dust to settle.&lt;br /&gt;
&lt;br /&gt;
=== H.264 MVC ===&lt;br /&gt;
* Implement an H.264 MVC decoder. This format is used in 3D Blu-Rays.&lt;br /&gt;
&lt;br /&gt;
=== DTS-HD decoder ===&lt;br /&gt;
&lt;br /&gt;
* ETSI has released the specification (http://www.etsi.org/deliver/etsi_ts/102100_102199/102114/01.03.01_60/ts_102114v010301p.pdf). Your job is to complete the existing decoder with the following features.&lt;br /&gt;
&lt;br /&gt;
 (1) Add support for mixed Core + DTS-HD stream structure&lt;br /&gt;
     (DtsCoreFrame+DtsHdFrame+DtsCoreFrame+DtsHdFrame+...), used by Blu-Ray main&lt;br /&gt;
     and commentary tracks.&lt;br /&gt;
 (2) Add support for XXCh extension (6.1 and 7.1 channels).&lt;br /&gt;
 (3) Add support for X96 extension (96 kHz).&lt;br /&gt;
 (4) Add support for XLL extension (lossless).&lt;br /&gt;
 (5) Add support for a pure DTS-HD stream structure&lt;br /&gt;
     (DtsHdFrame+DtsHdFrame+DtsHdFrame+...), used by Blu-Ray PiP tracks.&lt;br /&gt;
 (6) Add support for XBR extension (extra bitrate).&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== MPEG-4 ALS Roundup ===&lt;br /&gt;
&lt;br /&gt;
This task is to update and enhance the existing ALS decoder as well as integrate&lt;br /&gt;
and enhance the rudimentary encoder found at:&lt;br /&gt;
https://github.com/justinruggles/FFmpeg-alsenc&lt;br /&gt;
&lt;br /&gt;
Possible features are:&lt;br /&gt;
&lt;br /&gt;
* implement rls-lms in the decoder&lt;br /&gt;
* do correct channel layout/sort handling in the decoder&lt;br /&gt;
* update to current master&lt;br /&gt;
* use codec private options&lt;br /&gt;
* implement encode2(), setting pts and duration&lt;br /&gt;
* document options and examples in encoders.texi&lt;br /&gt;
* come up with a good set of encoding tests for FATE&lt;br /&gt;
* implement mcc/channel sort in the encoder&lt;br /&gt;
* implement rls-lms in the encoder&lt;br /&gt;
* implement float support&lt;br /&gt;
&lt;br /&gt;
=== reverse engineer [[TAK]] format ===&lt;br /&gt;
There is some interest in this format. Maybe a qualification task?&lt;br /&gt;
&lt;br /&gt;
=== support for more subtitle formats ===&lt;br /&gt;
We have libass support now; what is needed is either a parser (from MPlayer) to convert subtitles into ASS, or some other approach.&lt;br /&gt;
* http://mailman.videolan.org/pipermail/vlc-devel/2011-September/081803.html&lt;br /&gt;
* http://ale5000.altervista.org/subtitles.htm&lt;br /&gt;
&lt;br /&gt;
=== mkv ordered chapters / playlist support ===&lt;br /&gt;
Get playlist support into FFmpeg. Missing playlist support is blocking a few things, such as QuickTime edit lists and .asx/.pls files.&lt;br /&gt;
&lt;br /&gt;
=== adobe fragmented http in/out ===&lt;br /&gt;
Adobe has a new streaming format.&lt;br /&gt;
* info/spec: http://www.adobe.com/products/httpdynamicstreaming/&lt;br /&gt;
&lt;br /&gt;
=== libavfilter 9/10bit support ===&lt;br /&gt;
Make filters work with higher bit-depth codecs/colorspaces.&lt;br /&gt;
&lt;br /&gt;
=== fix copying video between formats ===&lt;br /&gt;
Lots of H.264 streams exist in FLV, MP4, MKV and MPEG-TS containers. People wish to remux these into various formats so they work on their hardware (PS3, iPod, etc.).&lt;br /&gt;
 https://ffmpeg.org/trac/ffmpeg/ticket/796&lt;br /&gt;
 https://ffmpeg.org/trac/ffmpeg/ticket/822&lt;br /&gt;
 https://ffmpeg.org/trac/ffmpeg/ticket/954&lt;br /&gt;
 https://ffmpeg.org/trac/ffmpeg/ticket/976&lt;br /&gt;
&lt;br /&gt;
=== extend paletted format support ===&lt;br /&gt;
&lt;br /&gt;
Clean up the framework for better handling of paletted formats, write a posterize filter, add palette output support to libswscale (possibly making use of libavcodec/elbg), and add support for reading a palette from and saving it to a file and applying it to the input video (e.g. by creating ad-hoc filters).&lt;br /&gt;
&lt;br /&gt;
=== g2m4 decoder ===&lt;br /&gt;
Lots of VideoLAN users are requesting support for this codec. There is a binary decoder in MPlayer. It is possible that Maxim is already working on this.&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=PMP&amp;diff=13118</id>
		<title>PMP</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=PMP&amp;diff=13118"/>
		<updated>2010-11-04T20:41:40Z</updated>

		<summary type="html">&lt;p&gt;Reimar: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;PMP format&lt;br /&gt;
&lt;br /&gt;
All values are stored as little-endian&lt;br /&gt;
&lt;br /&gt;
header:&lt;br /&gt;
  0-3   &amp;quot;pmpm&amp;quot;&lt;br /&gt;
  4-7   1 (version?)&lt;br /&gt;
  8-b  video format (0 = MPEG-4 ASP, 1 = H.264)&lt;br /&gt;
  c-f  number of packets (== number of frames?)&lt;br /&gt;
 10-13 video width&lt;br /&gt;
 14-17 video height&lt;br /&gt;
 18-1b time base num&lt;br /&gt;
 1c-1f time base den&lt;br /&gt;
 20-23 audio format (0 = MP3, 1 = AAC)&lt;br /&gt;
 24-27 number of audio streams (all the same format)&lt;br /&gt;
 28-2b ??&lt;br /&gt;
 2c-2f ??&lt;br /&gt;
 30-33 sample rate&lt;br /&gt;
 34-37 channels - 1?&lt;br /&gt;
 38-   list of packet sizes, 4 bytes each, lowest bit is keyframe flag&lt;br /&gt;
 following: data packets&lt;br /&gt;
&lt;br /&gt;
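The header layout above could be read with a sketch like the following (hypothetical Python, not an official parser; it assumes the fields are consecutive 4-byte little-endian values, and all field names are mine):

```python
import io

def parse_pmp_header(f):
    # Hypothetical sketch based on the layout above; field names are guesses.
    def u32():
        return int.from_bytes(f.read(4), "little")
    if f.read(4) != b"pmpm":
        raise ValueError("not a PMP file")
    hdr = {"version": u32()}             # 1 (version?)
    hdr["video_format"] = u32()          # 0 = MPEG-4 ASP, 1 = H.264
    hdr["num_packets"] = u32()           # == number of frames?
    hdr["width"] = u32()
    hdr["height"] = u32()
    hdr["tb_num"] = u32()                # time base numerator
    hdr["tb_den"] = u32()                # time base denominator
    hdr["audio_format"] = u32()          # 0 = MP3, 1 = AAC
    hdr["num_audio_streams"] = u32()     # all the same format
    f.read(8)                            # two unknown 4-byte fields
    hdr["sample_rate"] = u32()
    hdr["channels_minus_1"] = u32()      # meaning uncertain
    # packet size table: one 4-byte value per packet,
    # lowest bit is the keyframe flag
    hdr["packets"] = []
    for _ in range(hdr["num_packets"]):
        v = u32()
        hdr["packets"].append((v // 2, v % 2))  # (size?, keyframe flag)
    return hdr
```

Whether the remaining bits of each packet-size entry are the size shifted down or the size with the flag bit cleared is not known; the sketch assumes the former.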
data packet:&lt;br /&gt;
 0   number of audio packets (per audio stream)&lt;br /&gt;
 1-4 ?&lt;br /&gt;
 5-8 ?&lt;br /&gt;
 9-c length of video data&lt;br /&gt;
 audio packet sizes, 4 bytes each&lt;br /&gt;
 video data&lt;br /&gt;
 audio data, AAC packets lack 7 bytes of AAC header&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=PMP&amp;diff=13117</id>
		<title>PMP</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=PMP&amp;diff=13117"/>
		<updated>2010-11-04T20:40:59Z</updated>

		<summary type="html">&lt;p&gt;Reimar: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;PMP format&lt;br /&gt;
&lt;br /&gt;
All values are stored as little-endian&lt;br /&gt;
&lt;br /&gt;
header:&lt;br /&gt;
 0-3   &amp;quot;pmpm&amp;quot;&lt;br /&gt;
 4-7   1 (version?)&lt;br /&gt;
 8-b  video format (0 = MPEG-4 ASP, 1 = H.264)&lt;br /&gt;
 c-f  number of packets (== number of frames?)&lt;br /&gt;
10-13 video width&lt;br /&gt;
14-17 video height&lt;br /&gt;
18-1b time base num&lt;br /&gt;
1c-1f time base den&lt;br /&gt;
20-23 audio format (0 = MP3, 1 = AAC)&lt;br /&gt;
24-27 number of audio streams (all the same format)&lt;br /&gt;
28-2b ??&lt;br /&gt;
2c-2f ??&lt;br /&gt;
30-33 sample rate&lt;br /&gt;
34-37 channels - 1?&lt;br /&gt;
38-   list of packet sizes, 4 bytes each, lowest bit is keyframe flag&lt;br /&gt;
following: data packets&lt;br /&gt;
&lt;br /&gt;
data packet:&lt;br /&gt;
0   number of audio packets (per audio stream)&lt;br /&gt;
1-4 ?&lt;br /&gt;
5-8 ?&lt;br /&gt;
9-c length of video data&lt;br /&gt;
audio packet sizes, 4 bytes each&lt;br /&gt;
video data&lt;br /&gt;
audio data, AAC packets lack 7 bytes of AAC header&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=GSM_06.10&amp;diff=12793</id>
		<title>GSM 06.10</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=GSM_06.10&amp;diff=12793"/>
		<updated>2010-07-03T22:15:25Z</updated>

		<summary type="html">&lt;p&gt;Reimar: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;* FOURCCs: agsm&lt;br /&gt;
* Website: http://kbs.cs.tu-berlin.de/~jutta/toast.html&lt;br /&gt;
* Specification: http://www.3gpp.org/ftp/Specs/2010-06/R1999/06_series/0610-820.zip&lt;br /&gt;
&lt;br /&gt;
GSM 06.10 is a GSM [[vocoder]] standard that also occurs in some multimedia files. Also see [[Microsoft GSM]].&lt;br /&gt;
&lt;br /&gt;
[[Category:Vocoders]]&lt;br /&gt;
[[Category:Audio Codecs]]&lt;br /&gt;
[[Category: Formats missing in FFmpeg]]&lt;br /&gt;
&lt;br /&gt;
Format:&lt;br /&gt;
Each frame consists of 33 bytes, in a little-endian bitstream.&lt;br /&gt;
&lt;br /&gt;
Bitstream layout:&lt;br /&gt;
&lt;br /&gt;
4 bits: magic 0xd,&lt;br /&gt;
2 x 6 bits, 2 x 5 bits, 2 x 4 bits, 2 x 3 bits&lt;br /&gt;
&lt;br /&gt;
4 x coefficients (&lt;br /&gt;
7 bits,&lt;br /&gt;
2 bits,&lt;br /&gt;
2 bits,&lt;br /&gt;
6 bits,&lt;br /&gt;
13 x 3 bits&lt;br /&gt;
)&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=GSM_06.10&amp;diff=12792</id>
		<title>GSM 06.10</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=GSM_06.10&amp;diff=12792"/>
		<updated>2010-07-03T21:07:10Z</updated>

		<summary type="html">&lt;p&gt;Reimar: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;* FOURCCs: agsm&lt;br /&gt;
* Website: http://kbs.cs.tu-berlin.de/~jutta/toast.html&lt;br /&gt;
* Specification: http://www.3gpp.org/ftp/Specs/2010-06/R1999/06_series/0610-820.zip&lt;br /&gt;
&lt;br /&gt;
GSM 06.10 is a GSM [[vocoder]] standard that also occurs in some multimedia files. Also see [[Microsoft GSM]].&lt;br /&gt;
&lt;br /&gt;
[[Category:Vocoders]]&lt;br /&gt;
[[Category:Audio Codecs]]&lt;br /&gt;
[[Category: Formats missing in FFmpeg]]&lt;br /&gt;
&lt;br /&gt;
Format:&lt;br /&gt;
Each frame consists of 33 bytes, in a big-endian bitstream.&lt;br /&gt;
&lt;br /&gt;
Bitstream layout:&lt;br /&gt;
&lt;br /&gt;
4 bits: magic 0xd,&lt;br /&gt;
2 x 6 bits, 2 x 5 bits, 2 x 4 bits, 2 x 3 bits&lt;br /&gt;
&lt;br /&gt;
4 x coefficients (&lt;br /&gt;
7 bits,&lt;br /&gt;
2 bits,&lt;br /&gt;
2 bits,&lt;br /&gt;
6 bits,&lt;br /&gt;
13 x 3 bits&lt;br /&gt;
)&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=GSM_06.10&amp;diff=12781</id>
		<title>GSM 06.10</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=GSM_06.10&amp;diff=12781"/>
		<updated>2010-07-03T11:54:10Z</updated>

		<summary type="html">&lt;p&gt;Reimar: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;* FOURCCs: agsm&lt;br /&gt;
* Website: http://kbs.cs.tu-berlin.de/~jutta/toast.html&lt;br /&gt;
* Specification: http://www.3gpp.org/ftp/Specs/2010-06/R1999/06_series/0610-820.zip&lt;br /&gt;
&lt;br /&gt;
GSM 06.10 is a GSM [[vocoder]] standard that also occurs in some multimedia files. Also see [[Microsoft GSM]].&lt;br /&gt;
&lt;br /&gt;
[[Category:Vocoders]]&lt;br /&gt;
[[Category:Audio Codecs]]&lt;br /&gt;
[[Category: Formats missing in FFmpeg]]&lt;br /&gt;
&lt;br /&gt;
Format:&lt;br /&gt;
Each frame consists of 33 bytes, in a big-endian bitstream.&lt;br /&gt;
&lt;br /&gt;
Bitstream layout:&lt;br /&gt;
&lt;br /&gt;
4 bits: magic 0xd&lt;br /&gt;
8 x 6 bits&lt;br /&gt;
&lt;br /&gt;
4 x coefficients (&lt;br /&gt;
7 bits&lt;br /&gt;
2 bits&lt;br /&gt;
2 bits&lt;br /&gt;
6 bits&lt;br /&gt;
13 x 3 bits&lt;br /&gt;
)&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=Phantom_Cine&amp;diff=12689</id>
		<title>Phantom Cine</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=Phantom_Cine&amp;diff=12689"/>
		<updated>2010-06-13T12:12:50Z</updated>

		<summary type="html">&lt;p&gt;Reimar: Clarify: current description is not based on specification&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Extensions: cine&lt;br /&gt;
* Company: [[Vision Research Inc.]]&lt;br /&gt;
* Samples: [http://samples.mplayerhq.hu/V-codecs/Phantom_Cine http://samples.mplayerhq.hu/V-codecs/Phantom_Cine]&lt;br /&gt;
&lt;br /&gt;
The Phantom Cine format&lt;br /&gt;
&lt;br /&gt;
Note: the information below is based on examining a sample file; please verify it against the specification linked at the end.&lt;br /&gt;
&lt;br /&gt;
This is a raw uncompressed format.&lt;br /&gt;
&lt;br /&gt;
It contains a lot of metadata, an index, and raw uncompressed frames, presumably in Bayer format.&lt;br /&gt;
Known samples store each Bayer sample in 16 bits, of which only the lowest 14 bits are relevant (the value 16 can be determined from the BitmapInfoHeader; the 14-bit value is stored somewhere in the file as well, as can be seen from the XML representation of the metadata (the RealBPP entry) - all data in the XML file exists in the .cine file as well). Making the lowest instead of the highest 14 bits the relevant ones is a rather unfortunate choice, since it makes conversion to standard formats more difficult.&lt;br /&gt;
&lt;br /&gt;
Note: some of the fixed offsets below might not always be at that offset, though they are for the known samples and there are no length fields that would allow calculating them in a sensible way.&lt;br /&gt;
&lt;br /&gt;
All values are stored in little-endian format.&lt;br /&gt;
&lt;br /&gt;
The files start with the bytes &amp;quot;CI&amp;quot;, followed by two bytes possibly giving a size or index (all files so far have the value 2c 00).&lt;br /&gt;
&lt;br /&gt;
This &amp;quot;main header&amp;quot; contains this data:&lt;br /&gt;
  2 bytes &amp;quot;CI&amp;quot;&lt;br /&gt;
  2 bytes length&lt;br /&gt;
  2 bytes &amp;quot;compression type&amp;quot; (always 2) ??&lt;br /&gt;
  2 bytes version (always 1)&lt;br /&gt;
  4 bytes number of first frame (usually negative)&lt;br /&gt;
  4 bytes number of frames in files&lt;br /&gt;
  4 bytes number of first frame (repeated?)&lt;br /&gt;
  4 bytes number of frames in files (repeated?)&lt;br /&gt;
  4 bytes offset of BitmapInfoHeader&lt;br /&gt;
  4 bytes offset of ???&lt;br /&gt;
  4 bytes offset of file index&lt;br /&gt;
  8 bytes recording time stamp (?)&lt;br /&gt;
&lt;br /&gt;
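The main header described above could be parsed with a sketch like the following (hypothetical Python; field names are mine, and the layout should be verified against the official CINE specification linked at the end):

```python
import io

def parse_cine_main_header(f):
    # Hypothetical sketch of the main header layout above (44 = 0x2c bytes).
    def u16():
        return int.from_bytes(f.read(2), "little")
    def u32(signed=False):
        return int.from_bytes(f.read(4), "little", signed=signed)
    if f.read(2) != b"CI":
        raise ValueError("not a CINE file")
    hdr = {"length": u16()}                 # 0x2c in all known files
    hdr["compression"] = u16()              # "compression type" (always 2) ??
    hdr["version"] = u16()                  # always 1
    hdr["first_frame"] = u32(signed=True)   # usually negative
    hdr["frame_count"] = u32()
    hdr["first_frame_repeat"] = u32()       # repeated?
    hdr["frame_count_repeat"] = u32()       # repeated?
    hdr["bmih_offset"] = u32()              # offset of BitmapInfoHeader
    hdr["unknown_offset"] = u32()           # offset of ???
    hdr["index_offset"] = u32()             # offset of file index
    hdr["timestamp"] = f.read(8)            # recording time stamp (?)
    return hdr
```

Note that the fields add up to exactly 0x2c bytes, matching the length value seen in known files.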
At offset 0x2c there is a BitmapInfoHeader structure. The first 4 bytes give its length (0x28).&lt;br /&gt;
&lt;br /&gt;
The next section starts with &amp;quot;ST&amp;quot; at offset 0xe0, again followed by two bytes giving a size or index (so far always 6c 16).&lt;br /&gt;
&lt;br /&gt;
The variable-length part seems to start at offset 0x16c0. Note that this is the sum of the possible length fields from CI, ST and the BitmapInfoHeader (i.e. 0x2c + 0x28 + 0x166c = 0x16c0).&lt;br /&gt;
&lt;br /&gt;
It consists of 3 parts:&lt;br /&gt;
&lt;br /&gt;
1) timestamps&lt;br /&gt;
&lt;br /&gt;
2) exposure values&lt;br /&gt;
&lt;br /&gt;
3) index&lt;br /&gt;
&lt;br /&gt;
The timestamp and exposure-value parts start with a 4-byte length field, allowing them to be skipped.&lt;br /&gt;
&lt;br /&gt;
The index contains an 8-byte offset value for each frame. It does not have a length field; the number of entries is determined from the number of frames in the main header.&lt;br /&gt;
&lt;br /&gt;
At each offset indicated by the index, there is first a header. The first 4 bytes of the header give the length of the header. Only the value 8 has a known meaning, in which case the next 4 bytes give the length of the frame data.&lt;br /&gt;
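Reading one frame via the index can then be sketched as follows (hypothetical Python; the helper name is mine):

```python
import io

def read_frame_data(f, offset):
    # At the given index offset there is first a header: 4 bytes give the
    # header length, where only the value 8 has a known meaning; in that
    # case the next 4 bytes give the length of the frame data that follows.
    f.seek(offset)
    hdr_len = int.from_bytes(f.read(4), "little")
    if hdr_len != 8:
        raise ValueError("unknown frame header length %d" % hdr_len)
    data_len = int.from_bytes(f.read(4), "little")
    return f.read(data_len)
```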
&lt;br /&gt;
== External Links ==&lt;br /&gt;
* [http://www.visionresearch.com/devzonedownloads/cine640.pdf http://www.visionresearch.com/devzonedownloads/cine640.pdf] The CINE File Format, Vision Research Inc., 2007&lt;br /&gt;
&lt;br /&gt;
[[Category:Container Formats]]&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=Phantom_Cine&amp;diff=12688</id>
		<title>Phantom Cine</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=Phantom_Cine&amp;diff=12688"/>
		<updated>2010-06-13T12:11:24Z</updated>

		<summary type="html">&lt;p&gt;Reimar: typo fix&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Extensions: cine&lt;br /&gt;
* Company: [[Vision Research Inc.]]&lt;br /&gt;
* Samples: [http://samples.mplayerhq.hu/V-codecs/Phantom_Cine http://samples.mplayerhq.hu/V-codecs/Phantom_Cine]&lt;br /&gt;
&lt;br /&gt;
The Phantom Cine format&lt;br /&gt;
&lt;br /&gt;
This is a raw uncompressed format.&lt;br /&gt;
&lt;br /&gt;
It contains a lot of metadata, an index, and raw uncompressed frames, presumably in Bayer format.&lt;br /&gt;
Known samples store each Bayer sample in 16 bits, of which only the lowest 14 bits are relevant (the value 16 can be determined from the BitmapInfoHeader; the 14-bit value is stored somewhere in the file as well, as can be seen from the XML representation of the metadata (the RealBPP entry) - all data in the XML file exists in the .cine file as well). Making the lowest instead of the highest 14 bits the relevant ones is a rather unfortunate choice, since it makes conversion to standard formats more difficult.&lt;br /&gt;
&lt;br /&gt;
Note: some of the fixed offsets below might not always be at that offset, though they are for the known samples and there are no length fields that would allow calculating them in a sensible way.&lt;br /&gt;
&lt;br /&gt;
All values are stored in little-endian format.&lt;br /&gt;
&lt;br /&gt;
The files start with the bytes &amp;quot;CI&amp;quot;, followed by two bytes possibly giving a size or index (all files so far have the value 2c 00).&lt;br /&gt;
&lt;br /&gt;
This &amp;quot;main header&amp;quot; contains this data:&lt;br /&gt;
  2 bytes &amp;quot;CI&amp;quot;&lt;br /&gt;
  2 bytes length&lt;br /&gt;
  2 bytes &amp;quot;compression type&amp;quot; (always 2) ??&lt;br /&gt;
  2 bytes version (always 1)&lt;br /&gt;
  4 bytes number of first frame (usually negative)&lt;br /&gt;
  4 bytes number of frames in files&lt;br /&gt;
  4 bytes number of first frame (repeated?)&lt;br /&gt;
  4 bytes number of frames in files (repeated?)&lt;br /&gt;
  4 bytes offset of BitmapInfoHeader&lt;br /&gt;
  4 bytes offset of ???&lt;br /&gt;
  4 bytes offset of file index&lt;br /&gt;
  8 bytes recording time stamp (?)&lt;br /&gt;
&lt;br /&gt;
At offset 0x2c there is a BitmapInfoHeader structure. The first 4 bytes give its length (0x28).&lt;br /&gt;
&lt;br /&gt;
The next section starts with &amp;quot;ST&amp;quot; at offset 0xe0, again followed by two bytes giving a size or index (so far always 6c 16).&lt;br /&gt;
&lt;br /&gt;
The variable-length part seems to start at offset 0x16c0. Note that this is the sum of the possible length fields from CI, ST and the BitmapInfoHeader (i.e. 0x2c + 0x28 + 0x166c = 0x16c0).&lt;br /&gt;
&lt;br /&gt;
It consists of 3 parts:&lt;br /&gt;
&lt;br /&gt;
1) timestamps&lt;br /&gt;
&lt;br /&gt;
2) exposure values&lt;br /&gt;
&lt;br /&gt;
3) index&lt;br /&gt;
&lt;br /&gt;
The timestamp and exposure-value parts start with a 4-byte length field, allowing them to be skipped.&lt;br /&gt;
&lt;br /&gt;
The index contains an 8-byte offset value for each frame. It does not have a length field; the number of entries is determined from the number of frames in the main header.&lt;br /&gt;
&lt;br /&gt;
At each offset indicated by the index, there is first a header. The first 4 bytes of the header give the length of the header. Only the value 8 has a known meaning, in which case the next 4 bytes give the length of the frame data.&lt;br /&gt;
&lt;br /&gt;
== External Links ==&lt;br /&gt;
* [http://www.visionresearch.com/devzonedownloads/cine640.pdf http://www.visionresearch.com/devzonedownloads/cine640.pdf] The CINE File Format, Vision Research Inc., 2007&lt;br /&gt;
&lt;br /&gt;
[[Category:Container Formats]]&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=Small_FFmpeg_Tasks&amp;diff=12463</id>
		<title>Small FFmpeg Tasks</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=Small_FFmpeg_Tasks&amp;diff=12463"/>
		<updated>2010-03-28T12:13:14Z</updated>

		<summary type="html">&lt;p&gt;Reimar: /* implement some colorspace fourcc/codecs */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page contains ideas for small, relatively simple tasks for the [[FFmpeg]] project. People who might be interested in trying one of these tasks:&lt;br /&gt;
* Someone who wants to contribute to FFmpeg and needs to find a well-defined task to start with&lt;br /&gt;
* Someone who wishes to qualify for one of FFmpeg's coveted [[FFmpeg Summer Of Code|Summer of Code]] project slots&lt;br /&gt;
* An existing FFmpeg developer who has been away from the project for a while and needs a smaller task as motivation for re-learning the codebase&lt;br /&gt;
&lt;br /&gt;
For other tasks of varying difficulty, see the [[Interesting Patches]] page.&lt;br /&gt;
&lt;br /&gt;
'''If you would like to work on one of these tasks''', please take these steps:&lt;br /&gt;
* Subscribe to the [https://lists.mplayerhq.hu/mailman/listinfo/ffmpeg-devel FFmpeg development mailing list] and indicate your interest&lt;br /&gt;
* Ask [[User:Multimedia Mike|Multimedia Mike]] for a Wiki account so you can claim your task on this Wiki&lt;br /&gt;
&lt;br /&gt;
'''If you would like to add to this list''', please be prepared to explain some useful details about the task. Excessively vague tasks with no supporting details will be ruthlessly deleted.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Finish up a previous incomplete SoC project ===&lt;br /&gt;
&lt;br /&gt;
Several SoC projects from previous years have not yet made it into FFmpeg. Taking any of them and finishing them up to the point that they can be included should make for a good qualification task. Check out the [[FFmpeg Summer Of Code]] overview page and look for the unfinished projects, like AMR-NB, Dirac, TS muxer, JPEG 2000.&lt;br /&gt;
&lt;br /&gt;
=== Generic Colorspace system ===&lt;br /&gt;
This task involves adding support for more than 8 bits per component (for example Y, U and V on 10 bits each)&lt;br /&gt;
and generic simple conversion to other colorspaces.&lt;br /&gt;
&lt;br /&gt;
''Does this have to do with revising FFmpeg's infrastructure? If so, then it doesn't feel like a qualification task. If it's something simpler, then the vague description does not convey that simplicity. Please expound.'' --[[User:Multimedia Mike|Multimedia Mike]] 12:56, 25 February 2008 (EST)&lt;br /&gt;
&lt;br /&gt;
''I don't think so, extending PixFmt to extended structure with finegrained description like depth, range values, colorspace, sample period, and write generic simple conversion from all formats to all others, like suggested by Michael on the mailing list. Conversion routine can be a good qualification task for video encoders/decoders. What do you think ?&lt;br /&gt;
--[[User:Bcoudurier|Baptiste Coudurier]] 00:30, 29 February 2008 (EST)&lt;br /&gt;
&lt;br /&gt;
''* Adding the [[YCoCg]] colorspace (with different sized planes) for RGB sourced pictures would be nice too. [[User:Elte|Elte]] 07:15, 16 March 2009 (EDT)&lt;br /&gt;
&lt;br /&gt;
=== Make the SoC dts encoder multichannel capable ===&lt;br /&gt;
Here is a skeleton for a dts encoder http://svn.mplayerhq.hu/soc/dcaenc/, currently it can only encode stereo streams.&lt;br /&gt;
The task is to extend it to support 5.1 channels also.&lt;br /&gt;
&lt;br /&gt;
Specs and info can be found here:&lt;br /&gt;
http://wiki.multimedia.cx/index.php?title=DTS&lt;br /&gt;
&lt;br /&gt;
=== GIF LZW Encoder and extend Encoder and Decoder to support Animated GIFs ===&lt;br /&gt;
&lt;br /&gt;
An LZW encoder is already used for TIFF; it must be extended to support the GIF flavor.&lt;br /&gt;
&lt;br /&gt;
=== Implement a Vivo demuxer for FFmpeg ===&lt;br /&gt;
Implement an FFmpeg demuxer for the [[Vivo]] file format. The best reference for understanding the format would be MPlayer's [http://svn.mplayerhq.hu/mplayer/trunk/libmpdemux/demux_viv.c?view=markup existing .viv demuxer].&lt;br /&gt;
&lt;br /&gt;
This task corresponds to issue 99: http://roundup.ffmpeg.org/roundup/ffmpeg/issue99&lt;br /&gt;
&lt;br /&gt;
''I am ready to help out with understanding MPlayer's demuxer, esp. MPlayer API stuff if necessary.&lt;br /&gt;
--[[User:Reimar|Reimar]] 15:46, 1 March 2008 (EST)&lt;br /&gt;
&lt;br /&gt;
=== Port missing demuxers from MPlayer to FFmpeg ===&lt;br /&gt;
MPlayer supports a few container formats in libmpdemux that are not yet present in libavformat. Porting them over and getting them relicensed as LGPL, or reimplementing them from scratch, should make reasonable small tasks.&lt;br /&gt;
&lt;br /&gt;
# TiVo -- ''Jai Menon is working on this''&lt;br /&gt;
# VIVO -- ''Daniel Verkamp has a patch for this''&lt;br /&gt;
# SL support for MPEG-TS (anyone got samples?)&lt;br /&gt;
# MNG&lt;br /&gt;
&lt;br /&gt;
=== Optimal Huffman tables for (M)JPEG ===&lt;br /&gt;
This task is outlined at http://guru.multimedia.cx/small-tasks-for-ffmpeg/ and is tracked in the issue tracker: http://roundup.ffmpeg.org/roundup/ffmpeg/issue267&lt;br /&gt;
&lt;br /&gt;
=== YOP Playback System ===&lt;br /&gt;
This task is to implement an FFmpeg playback subsystem for [[Psygnosis YOP]] files. This will entail writing a new file demuxer and video decoder, both of which are trivial by FFmpeg standards. [[Psygnosis YOP|The Psygnosis YOP page]] contains the specs necessary to complete this task and points to downloadable samples.&lt;br /&gt;
:''Patch pending on -devel&lt;br /&gt;
&lt;br /&gt;
=== M95 Playback System ===&lt;br /&gt;
This task is to implement an FFmpeg playback subsystem for [[M95]] files. This will entail writing a new file demuxer and video decoder (the audio is already uncompressed), both of which are trivial by FFmpeg standards. [[M95|The M95 page]] contains the specs necessary to complete this task and points to downloadable samples.&lt;br /&gt;
&lt;br /&gt;
=== BRP Playback System ===&lt;br /&gt;
This task is to implement an FFmpeg playback subsystem for [[BRP]] files. This will entail writing a new file demuxer as well as a video decoder that can handle at least 2 variations of format data. Further, write an audio decoder for the custom DPCM format in the file. All of these tasks are considered trivial by FFmpeg standards. [[BRP|The BRP page]] contains the specs necessary to complete this task and points to downloadable samples for both known variations.&lt;br /&gt;
&lt;br /&gt;
=== 16-bit Interplay Video Decoder ===&lt;br /&gt;
FFmpeg already supports [[Interplay MVE]] files with [[Interplay Video|8-bit video data]] inside. This task involves supporting 16-bit video data. The video encoding format is mostly the same but the pixel size is twice as large. Engage the ffmpeg-devel list to discuss how best to approach this task.&lt;br /&gt;
&lt;br /&gt;
=== 16-bit VQA Video Decoder ===&lt;br /&gt;
FFmpeg already supports Westwood [[VQA]] files. However, there are 3 variations of its custom video codec. The first 2 are supported in FFmpeg. This task involves implementing support for the 3rd variation. Visit the VQA samples repository: http://samples.mplayerhq.hu/game-formats/vqa/ -- The files in the directories Tiberian Sun VQAs/, bladerunner/, and dune2000/ use the 3rd variation of this codec. The [[VQA|VQA page]] should link to all the details you need to support this format.&lt;br /&gt;
&lt;br /&gt;
Discussion/patch: ([http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2009-March/065348.html reference])&lt;br /&gt;
&lt;br /&gt;
=== HNM4 Playback System ===&lt;br /&gt;
This task is to implement an FFmpeg playback subsystem for [[HNM4]] variant of the [[HNM]] format. This will entail writing a new file demuxer and video decoder, both of which are trivial by FFmpeg standards. [[HNM4|The HNM4 page]] contains the specs necessary to complete this task and links to downloadable samples.&lt;br /&gt;
&lt;br /&gt;
=== Apple RPZA encoder ===&lt;br /&gt;
A patch was once sent to the ffmpeg-devel mailing list to include an encoder for the [[Apple RPZA]] video codec. That code can be found on the &amp;quot;[[Interesting Patches]]&amp;quot; page. This qualification task involves applying that patch so that it can compile with current FFmpeg SVN code and then cleaning it up per the standards of the project. Engage the mailing list to learn more about what to do.&lt;br /&gt;
:''Claimed by Jai Menon''&lt;br /&gt;
&lt;br /&gt;
=== QuickTime Edit List Support ===&lt;br /&gt;
Implement edit list support in FFmpeg's QuickTime demuxer (libavformat/mov.c). This involves parsing the 'elst' atom in a QuickTime file. For a demonstration of how this is a problem, download the file menace00.mov from http://samples.mplayerhq.hu/mov/editlist/ and play it with ffplay or transcode it with ffmpeg. Notice that the audio and video are ever so slightly out of sync. Proper edit list support will solve that. Other samples in that directory also presumably exhibit edit list-related bugs. The [http://xine.cvs.sourceforge.net/xine/xine-lib/src/demuxers/demux_qt.c?view=markup Xine demuxer] has support for this, it might be useful for hints.&lt;br /&gt;
&lt;br /&gt;
(patch was submitted to ffmpeg-devel, around 14 March 2009)&lt;br /&gt;
&lt;br /&gt;
=== Implement the Flash Screen Video codec version 2 ===&lt;br /&gt;
FFmpeg is missing both a decoder and an encoder for this codec; it would be nice to have them.&lt;br /&gt;
&lt;br /&gt;
:''Daniel Verkamp is working on this''&lt;br /&gt;
&lt;br /&gt;
=== Add wma fixed point decoder back into libavcodec ===&lt;br /&gt;
http://svn.rockbox.org/viewvc.cgi/trunk/apps/codecs/libwma/&lt;br /&gt;
Rockbox's fixed-point WMA decoder was adapted from the decoder in libavcodec.&lt;br /&gt;
&lt;br /&gt;
=== RealAudio 14.4 encoder ===&lt;br /&gt;
FFmpeg contains a decoder for [[RealAudio 14.4]], a fairly simple integer CELP codec. Write an encoder. This would be a good qualification task for anyone interested in working on AMR, Speex, or sipr.&lt;br /&gt;
&lt;br /&gt;
=== VC1 timestamps in m2ts ===&lt;br /&gt;
&lt;br /&gt;
Codec copy of VC1 from m2ts currently doesn't work. Either extend the VC1 parser to output/fix timestamps, or fix the timestamps from m2ts demuxing.&lt;br /&gt;
&lt;br /&gt;
=== FLIC work ===&lt;br /&gt;
&lt;br /&gt;
Revise the [[Flic Video]] decoder at libavcodec/flicvideo.c to support video transported in AVI or MOV files while making sure that data coming from the usual FLI files still works. 'AFLC' and 'flic' FourCC samples are linked from the [[Flic Video]] page.&lt;br /&gt;
&lt;br /&gt;
=== CJPG format ===&lt;br /&gt;
&lt;br /&gt;
Extend FFmpeg's MJPEG decoder to handle the different frames/packing of CJPG. Samples at: http://roundup.ffmpeg.org/roundup/ffmpeg/issue777&lt;br /&gt;
&lt;br /&gt;
=== flip flag for upside-down codecs ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;about the flip, a patch that decodes images flipped when&lt;br /&gt;
codec_tag == ff_get_fourcc(&amp;quot;GEOX&amp;quot;) is welcome.&lt;br /&gt;
it's a matter of 2 lines manipulating data/linesize of images after&lt;br /&gt;
get_buffer() or something similar&lt;br /&gt;
[...]&lt;br /&gt;
-- &lt;br /&gt;
Michael     GnuPG fingerprint: 9FF2128B147EF6730BADF133611EC787040B0FAB&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
more info:&lt;br /&gt;
http://roundup.ffmpeg.org/roundup/ffmpeg/issue741&lt;br /&gt;
&lt;br /&gt;
=== lavf-based concatenation tool ===&lt;br /&gt;
&lt;br /&gt;
As long as FFmpeg does not support multiple input files, it would be nice to have a libavformat-based tool that extracts frames from multiple files (possibly in different containers as well) and puts them into a single one.&lt;br /&gt;
&lt;br /&gt;
=== cljr and vcr1 encoders ===&lt;br /&gt;
According to this: http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2009-February/063647.html both of the encoders are disabled, and won't compile if enabled.  Michael would prefer to keep them around, and have someone grow them into full encoders.&lt;br /&gt;
&lt;br /&gt;
=== implement some colorspace fourcc/codecs ===&lt;br /&gt;
some colorspace formats were uploaded to http://samples.mplayerhq.hu/V-codecs/&lt;br /&gt;
including:&lt;br /&gt;
 CYUV.AVI is 8 Bit Interleaved 4:2:2&lt;br /&gt;
 a12v.avi is 4:2:2:4 10 Bit Interleaved&lt;br /&gt;
 auv2.avi is 4:2:2:4 8 Bit Interleaved&lt;br /&gt;
 and V-codecs/yuv8/MAILTEST.AVI .&lt;br /&gt;
&lt;br /&gt;
These might decode with current pixfmts; for that, all you will need is:&lt;br /&gt;
 cd ffmpeg&lt;br /&gt;
 svn di -r20378:20379&lt;br /&gt;
&lt;br /&gt;
step by step tutorial for adding new input formats to swscale:&lt;br /&gt;
 cd mplayer/libswscale/&lt;br /&gt;
 svn di -r20426:20427&lt;br /&gt;
 hunks 3 and 5 are not needed, they are optional special converters&lt;br /&gt;
 the change to isSupportedOut() is also not needed&lt;br /&gt;
 the above will add a new input format&lt;br /&gt;
&lt;br /&gt;
Another example of adding an input format:&lt;br /&gt;
 cd mplayer/libswscale/&lt;br /&gt;
 svn di -r20604:20605&lt;br /&gt;
&lt;br /&gt;
=== Implement Phantom Cine demuxer and Bayer format support for swscale ===&lt;br /&gt;
The format is described here:&lt;br /&gt;
http://wiki.multimedia.cx/index.php?title=Phantom_Cine&lt;br /&gt;
It will need support for Bayer -&amp;gt; RGB conversion in swscale to make the demuxer useful though.&lt;br /&gt;
&lt;br /&gt;
=== Make the rtp demuxer support rtcp BYE packets ===&lt;br /&gt;
rtcp BYE (203) packets are sent from the sender to the receiver to notify that a stream has ended.&lt;br /&gt;
FFmpeg currently ignores them.&lt;br /&gt;
&lt;br /&gt;
Sample URL: rtsp://media.lscube.org/tests/tc.mov&lt;br /&gt;
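The BYE packet layout comes from RFC 3550; here is a tiny Python sketch of detecting one (the function name is made up, and this is not FFmpeg's actual RTP code):&lt;br /&gt;

```python
# Hypothetical check for an RTCP BYE packet per RFC 3550 (not FFmpeg code).
RTCP_BYE = 203

def is_rtcp_bye(packet):
    """Return True if the buffer starts with an RTCP BYE header."""
    if len(packet) >= 4:
        version = packet[0] >> 6   # RTP/RTCP version field, must be 2
        ptype = packet[1]          # packet type; 203 marks a BYE packet
        return version == 2 and ptype == RTCP_BYE
    return False
```

On seeing such a packet, the demuxer could mark the corresponding stream as finished instead of ignoring it.&lt;br /&gt;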
&lt;br /&gt;
=== support for [[YCoCg]]/RGB colorspace in FFV1 ===&lt;br /&gt;
Add support for [[YCoCg]] and [[RGB]] encoded sources for the [[FFV1]] codec&lt;br /&gt;
&lt;br /&gt;
This would add a free lossless intra-frame RGB codec for all platforms supported by FFmpeg (most importantly MacOS + Windows), which is often requested for video editing in video forums (e.g. slashcam.de).&lt;br /&gt;
&lt;br /&gt;
=== Metal Gear Solid Video format demuxer ===&lt;br /&gt;
Write a demuxer to play video files harvested from the game Metal Gear Solid: The Twin Snakes. The format is described on the wiki page [[Metal Gear Solid VP3]] (which also contains links to samples). This page is based on observations and conjecture, so remember to engage the ffmpeg-devel mailing list with questions.&lt;br /&gt;
&lt;br /&gt;
=== [[IFF#ANIM|IFF ANIM]] decoder ===&lt;br /&gt;
Modify libavformat/iff.c to handle this chunk and write a decoder for the format. The wiki page at [[IFF#ANIM|IFF ANIM]] has links to more information and source code. Samples can be found at http://www-user.tu-chemnitz.de/~womar/projects/iffanim/iffanim_samplepack.zip .&lt;br /&gt;
&lt;br /&gt;
=== [[CDXL]] decoder ===&lt;br /&gt;
http://roundup.ffmpeg.org/roundup/ffmpeg/issue1012&lt;br /&gt;
&lt;br /&gt;
Write a decoder for this format using the information on the [[CDXL]] wiki page.&lt;br /&gt;
Discussed for the 2009 SoC.&lt;br /&gt;
&lt;br /&gt;
=== port missing decoders/demuxers from other open source projects. ===&lt;br /&gt;
&lt;br /&gt;
http://www.mega-nerd.com/libsndfile/#Features&lt;br /&gt;
 Paris Audio File PAF&lt;br /&gt;
 IRCAM SF&lt;br /&gt;
 GNU Octave 2.0 MAT4&lt;br /&gt;
 GNU Octave 2.1 MAT5&lt;br /&gt;
 Portable Voice Format PVFSound&lt;br /&gt;
 Designer II SD2&lt;br /&gt;
samples are here: http://www.mega-nerd.com/tmp/SoundFileCollection-20050711-0902.tgz&lt;br /&gt;
&lt;br /&gt;
http://www.hawksoft.com/hawkvoice/&lt;br /&gt;
 HVDI_VOICE_DATA- packet&lt;br /&gt;
 [[GSM]]&lt;br /&gt;
 LPC&lt;br /&gt;
 CELP&lt;br /&gt;
 LPC10&lt;br /&gt;
&lt;br /&gt;
http://sourceforge.net/projects/vgmstream&lt;br /&gt;
 150+ formats: http://vgmstream.svn.sourceforge.net/viewvc/vgmstream/readme.txt&lt;br /&gt;
&lt;br /&gt;
http://www.imagemagick.org&lt;br /&gt;
http://www.graphicsmagick.org/formats.html&lt;br /&gt;
 many image formats not in ffmpeg yet.&lt;br /&gt;
&lt;br /&gt;
http://gpac.sourceforge.net/&lt;br /&gt;
 [[MPEG-4 BIFS]]&lt;br /&gt;
 3GPP DIMS&lt;br /&gt;
 [[LASeR]]&lt;br /&gt;
 SAF&lt;br /&gt;
 SVG&lt;br /&gt;
 [[Synchronized Multimedia Integration Language|SMIL]]&lt;br /&gt;
 VRML&lt;br /&gt;
 X3D&lt;br /&gt;
 XMT&lt;br /&gt;
&lt;br /&gt;
http://adplug.sourceforge.net/&lt;br /&gt;
http://adplug.sourceforge.net/library/&lt;br /&gt;
 many OPL2/OPL3 audio formats not in ffmpeg yet.&lt;br /&gt;
&lt;br /&gt;
http://mikmod.raphnet.net/&lt;br /&gt;
http://mikmod.raphnet.net/#features&lt;br /&gt;
 many music pattern formats not in ffmpeg yet.&lt;br /&gt;
&lt;br /&gt;
http://www.fly.net/~ant/libs/audio.html#Game_Music_Emu&lt;br /&gt;
 AY&lt;br /&gt;
 GBS&lt;br /&gt;
 GYM&lt;br /&gt;
 HES&lt;br /&gt;
 KSS&lt;br /&gt;
 NSF, NSFE&lt;br /&gt;
 SAP&lt;br /&gt;
 [[SNES-SPC700 Sound Format]]&lt;br /&gt;
 VGM, VGZ&lt;br /&gt;
&lt;br /&gt;
=== port [[Ut Video]] decoder/encoder ===&lt;br /&gt;
A GPL v2 decoder/encoder is available at the wiki page.&lt;br /&gt;
&lt;br /&gt;
Claimed by shankhs ch as a GSoC 2010 qualification project.&lt;br /&gt;
&lt;br /&gt;
=== Sony psp demuxer ===&lt;br /&gt;
Create or port a demuxer for the Sony PlayStation Portable PMP format.&lt;br /&gt;
*Samples: http://samples.mplayerhq.hu/playstation/psp/&lt;br /&gt;
*mplayer demuxer: http://mplayer-ww.svn.sourceforge.net/viewvc/mplayer-ww/trunk/mplayer/libmpdemux/demux_pmp.c&lt;br /&gt;
&lt;br /&gt;
=== libswscale PAL8 output ===&lt;br /&gt;
&lt;br /&gt;
See the thread: &amp;quot;[RFC] libswscale palette output implementation&amp;quot;:&lt;br /&gt;
http://thread.gmane.org/gmane.comp.video.ffmpeg.devel/101397&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== vloopback output support ===&lt;br /&gt;
&lt;br /&gt;
vloopback is a Linux kernel device which allows creating a virtual video device that&lt;br /&gt;
programs can write to, and which can be accessed like a normal video device:&lt;br /&gt;
http://www.lavrsen.dk/twiki/bin/view/Motion/VideoFourLinuxLoopbackDevice&lt;br /&gt;
&lt;br /&gt;
This would allow writing the FFmpeg output to a vloopback device so it can be displayed by a&lt;br /&gt;
program reading from that device (e.g. Skype, a VoIP client, etc.).&lt;br /&gt;
&lt;br /&gt;
An example of a program which uses vloopback:&lt;br /&gt;
http://www.ws4gl.org/&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Port video filters from MPlayer/VLC/Mjpegtools/Effectv/etc etc to libavfilter ===&lt;br /&gt;
&lt;br /&gt;
There are plenty of programs providing their own filters, many of which may be easily ported to the &lt;br /&gt;
superior ;-) framework of libavfilter. It may also be possible to create wrappers around other libraries&lt;br /&gt;
(e.g. opencv, libgimp, libshowphoto, libaa).&lt;br /&gt;
&lt;br /&gt;
=== Add weighted motion compensation for B-frames in RV3/4 ===&lt;br /&gt;
&lt;br /&gt;
RealVideo 3 and 4 use weighted motion compensation for B-frames (like H.264), but the current code&lt;br /&gt;
always uses simple averaging. Reverse-engineer weighted MC functions for B-frames and make&lt;br /&gt;
RV3/4 decoder use them.&lt;br /&gt;
&lt;br /&gt;
[[Category:FFmpeg]]&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=Phantom_Cine&amp;diff=12462</id>
		<title>Phantom Cine</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=Phantom_Cine&amp;diff=12462"/>
		<updated>2010-03-28T12:09:31Z</updated>

		<summary type="html">&lt;p&gt;Reimar: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Phantom Cine format&lt;br /&gt;
&lt;br /&gt;
This is a raw uncompressed format.&lt;br /&gt;
&lt;br /&gt;
It contains a lot of metadata, an index, and raw uncompressed frames, presumably in Bayer format.&lt;br /&gt;
Known samples store each Bayer sample with 16 bits, of which only the lowest 14 bits are relevant (the value 16 can be determined from the BitmapInfoHeader; the 14-bit value is somewhere in the file as well, as can be seen from the XML representation of the metadata (the RealBPP entry) - all data in the XML file exists in the .cine file as well). Making the lowest instead of the highest 14 bits the relevant ones is a rather unfortunate choice since it makes conversion to standard formats more difficult.&lt;br /&gt;
&lt;br /&gt;
Note: some of the fixed offsets below might not always be at that offset, though they are for the known samples and there are no length fields that would allow calculating them in a sensible way.&lt;br /&gt;
&lt;br /&gt;
All values are stored in little-endian format.&lt;br /&gt;
&lt;br /&gt;
The files start with the bytes &amp;quot;CI&amp;quot;, followed by two bytes possibly giving a size or index (all files so far have the value 2c 00).&lt;br /&gt;
&lt;br /&gt;
This &amp;quot;main header&amp;quot; contains this data:&lt;br /&gt;
  2 bytes &amp;quot;CI&amp;quot;&lt;br /&gt;
  2 bytes length&lt;br /&gt;
  2 bytes &amp;quot;compression type&amp;quot; (always 2) ??&lt;br /&gt;
  2 bytes version (always 1)&lt;br /&gt;
  4 bytes number of first frame (usually negative)&lt;br /&gt;
  4 bytes number of frames in the file&lt;br /&gt;
  4 bytes number of first frame (repeated?)&lt;br /&gt;
  4 bytes number of frames in the file (repeated?)&lt;br /&gt;
  4 bytes offset of BitmapInfoHeader&lt;br /&gt;
  4 bytes offset of ???&lt;br /&gt;
  4 bytes offset of file index&lt;br /&gt;
  8 bytes recording time stamp (?)&lt;br /&gt;
&lt;br /&gt;
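A minimal Python sketch of parsing this main header, based only on the field list above (all field names are invented, and the layout is this page's conjecture, not an official spec):&lt;br /&gt;

```python
# Hypothetical parser for the Phantom Cine "CI" main header as described
# above; field names are made up, the layout follows the wiki notes.
def parse_main_header(data):
    """Parse the 44-byte (0x2c) main header; all values little-endian."""
    def u(off, size, signed=False):
        return int.from_bytes(data[off:off + size], "little", signed=signed)
    assert data[0:2] == b"CI"
    return {
        "length": u(2, 2),            # 0x2c in known samples
        "compression": u(4, 2),       # always 2 so far
        "version": u(6, 2),           # always 1
        "first_frame": u(8, 4, signed=True),   # usually negative
        "frame_count": u(12, 4),
        "first_frame2": u(16, 4),     # repeated?
        "frame_count2": u(20, 4),     # repeated?
        "bmih_offset": u(24, 4),      # offset of BitmapInfoHeader
        "unknown_offset": u(28, 4),
        "index_offset": u(32, 4),     # offset of file index
        "timestamp": u(36, 8),        # recording time stamp?
    }
```
&lt;br /&gt;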
At offset 0x2c there is a BitmapInfoHeader structure. The first 4 bytes give its length (0x28).&lt;br /&gt;
&lt;br /&gt;
The next section starts with &amp;quot;ST&amp;quot; at offset 0xe0, again followed by two bytes giving a size or index (so far always 6c 16).&lt;br /&gt;
&lt;br /&gt;
The variable-length part seems to start at offset 0x16c0. Note that this is the sum of the possible length fields from CI, ST and BitmapInfoHeader (i.e. 0x2c + 0x28 + 0x166c = 0x16c0).&lt;br /&gt;
&lt;br /&gt;
It consists of 3 parts:&lt;br /&gt;
&lt;br /&gt;
1) timestamps&lt;br /&gt;
&lt;br /&gt;
2) exposure values&lt;br /&gt;
&lt;br /&gt;
3) index&lt;br /&gt;
&lt;br /&gt;
The timestamp and exposure value parts start with a 4-byte length field that allows skipping them.&lt;br /&gt;
&lt;br /&gt;
The index contains an 8-byte offset value for each frame. It does not have a length field; the number of entries is determined from the number of frames in the main header.&lt;br /&gt;
&lt;br /&gt;
At each offset indicated by the index, there is first a header. The first 4 bytes of the header give the length of the header. Only the value 8 has a known meaning, in which case the next 4 bytes give the length of the frame data.&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=Phantom_Cine&amp;diff=12461</id>
		<title>Phantom Cine</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=Phantom_Cine&amp;diff=12461"/>
		<updated>2010-03-28T12:02:35Z</updated>

		<summary type="html">&lt;p&gt;Reimar: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Phantom Cine format&lt;br /&gt;
&lt;br /&gt;
This is a raw uncompressed format.&lt;br /&gt;
&lt;br /&gt;
It contains a lot of metadata, an index, and raw uncompressed frames, presumably in Bayer format.&lt;br /&gt;
&lt;br /&gt;
Note: some of the fixed offsets below might not always be at that offset, though they are for the known samples and there are no length fields that would allow calculating them in a sensible way.&lt;br /&gt;
&lt;br /&gt;
All values are stored in little-endian format.&lt;br /&gt;
&lt;br /&gt;
The files start with the bytes &amp;quot;CI&amp;quot;, followed by two bytes possibly giving a size or index (all files so far have the value 2c 00).&lt;br /&gt;
&lt;br /&gt;
This &amp;quot;main header&amp;quot; contains this data:&lt;br /&gt;
  2 bytes &amp;quot;CI&amp;quot;&lt;br /&gt;
  2 bytes length&lt;br /&gt;
  2 bytes &amp;quot;compression type&amp;quot; (always 2) ??&lt;br /&gt;
  2 bytes version (always 1)&lt;br /&gt;
  4 bytes number of first frame (usually negative)&lt;br /&gt;
  4 bytes number of frames in the file&lt;br /&gt;
  4 bytes number of first frame (repeated?)&lt;br /&gt;
  4 bytes number of frames in the file (repeated?)&lt;br /&gt;
  4 bytes offset of BitmapInfoHeader&lt;br /&gt;
  4 bytes offset of ???&lt;br /&gt;
  4 bytes offset of file index&lt;br /&gt;
  8 bytes recording time stamp (?)&lt;br /&gt;
&lt;br /&gt;
At offset 0x2c there is a BitmapInfoHeader structure. The first 4 bytes give its length (0x28).&lt;br /&gt;
&lt;br /&gt;
The next section starts with &amp;quot;ST&amp;quot; at offset 0xe0, again followed by two bytes giving a size or index (so far always 6c 16).&lt;br /&gt;
&lt;br /&gt;
The variable-length part seems to start at offset 0x16c0. Note that this is the sum of the possible length fields from CI, ST and BitmapInfoHeader (i.e. 0x2c + 0x28 + 0x166c = 0x16c0).&lt;br /&gt;
&lt;br /&gt;
It consists of 3 parts:&lt;br /&gt;
&lt;br /&gt;
1) timestamps&lt;br /&gt;
&lt;br /&gt;
2) exposure values&lt;br /&gt;
&lt;br /&gt;
3) index&lt;br /&gt;
&lt;br /&gt;
The timestamp and exposure value parts start with a 4-byte length field that allows skipping them.&lt;br /&gt;
&lt;br /&gt;
The index contains an 8-byte offset value for each frame. It does not have a length field; the number of entries is determined from the number of frames in the main header.&lt;br /&gt;
&lt;br /&gt;
At each offset indicated by the index, there is first a header. The first 4 bytes of the header give the length of the header. Only the value 8 has a known meaning, in which case the next 4 bytes give the length of the frame data.&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=Phantom_Cine&amp;diff=12460</id>
		<title>Phantom Cine</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=Phantom_Cine&amp;diff=12460"/>
		<updated>2010-03-28T11:43:36Z</updated>

		<summary type="html">&lt;p&gt;Reimar: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Phantom Cine format&lt;br /&gt;
&lt;br /&gt;
This is a raw uncompressed format.&lt;br /&gt;
&lt;br /&gt;
It contains a lot of metadata, an index, and raw uncompressed frames, presumably in Bayer format.&lt;br /&gt;
&lt;br /&gt;
Note: some of the fixed offsets below might not always be at that offset, though they are for the known samples and there are no length fields that would allow calculating them in a sensible way.&lt;br /&gt;
&lt;br /&gt;
The variable-length part seems to start at offset 0x16c0.&lt;br /&gt;
&lt;br /&gt;
It consists of 3 parts:&lt;br /&gt;
&lt;br /&gt;
1) timestamps&lt;br /&gt;
&lt;br /&gt;
2) exposure values&lt;br /&gt;
&lt;br /&gt;
3) index&lt;br /&gt;
&lt;br /&gt;
The timestamp and exposure value parts start with a 4-byte length field that allows skipping them.&lt;br /&gt;
&lt;br /&gt;
The index contains an 8-byte offset value for each frame. It does not have a length field; the number of entries is determined from the number of frames in the main header.&lt;br /&gt;
&lt;br /&gt;
At each offset indicated by the index, there is first a header. The first 4 bytes of the header give the length of the header. Only the value 8 has a known meaning, in which case the next 4 bytes give the length of the frame data.&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=Phantom_Cine&amp;diff=12459</id>
		<title>Phantom Cine</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=Phantom_Cine&amp;diff=12459"/>
		<updated>2010-03-28T11:42:53Z</updated>

		<summary type="html">&lt;p&gt;Reimar: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Phantom Cine format&lt;br /&gt;
&lt;br /&gt;
This is a raw uncompressed format.&lt;br /&gt;
It contains a lot of metadata, an index, and raw uncompressed frames, presumably in Bayer format.&lt;br /&gt;
Note: some of the fixed offsets below might not always be at that offset, though they are for the known samples and there are no length fields that would allow calculating them in a sensible way.&lt;br /&gt;
The variable-length part seems to start at offset 0x16c0.&lt;br /&gt;
It consists of 3 parts:&lt;br /&gt;
1) timestamps&lt;br /&gt;
2) exposure values&lt;br /&gt;
3) index&lt;br /&gt;
&lt;br /&gt;
The timestamp and exposure value parts start with a 4-byte length field that allows skipping them.&lt;br /&gt;
The index contains an 8-byte offset value for each frame. It does not have a length field; the number of entries is determined from the number of frames in the main header.&lt;br /&gt;
At each offset indicated by the index, there is first a header. The first 4 bytes of the header give the length of the header. Only the value 8 has a known meaning, in which case the next 4 bytes give the length of the frame data.&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=MPlayer_youtube_script&amp;diff=12113</id>
		<title>MPlayer youtube script</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=MPlayer_youtube_script&amp;diff=12113"/>
		<updated>2009-12-30T20:54:12Z</updated>

		<summary type="html">&lt;p&gt;Reimar: /* Or just use youtube-dl */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;bash script for playing youtube videos&lt;br /&gt;
&lt;br /&gt;
#save it as youtube.sh&lt;br /&gt;
#chmod +x youtube.sh&lt;br /&gt;
#place it somewhere in PATH (like /usr/local/bin )&lt;br /&gt;
&lt;br /&gt;
Usage:&lt;br /&gt;
*youtube.sh &amp;lt;url&amp;gt; [mplayer args]&lt;br /&gt;
*youtube.sh http://www.youtube.com/watch?v=example -dumpstream -dumpfile something.flv&lt;br /&gt;
*youtube.sh http://www.youtube.com/watch?v=example -aspect 16:9&lt;br /&gt;
*youtube.sh http://www.youtube.com/watch?v=example -xy 2&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
&lt;br /&gt;
if [ -z &amp;quot;$1&amp;quot; ]; then&lt;br /&gt;
        echo &amp;quot;No URL!&amp;quot;&lt;br /&gt;
        exit&lt;br /&gt;
fi&lt;br /&gt;
&lt;br /&gt;
url=$1&lt;br /&gt;
shift&lt;br /&gt;
&lt;br /&gt;
echo \&amp;quot;http://www.youtube.com/get_video?video_id=`wget -q -O - $url | grep fullscreenUrl | awk -F'video_id=' '{ print $2 }' | sed -e 's/ /_/g' | tr -d \'\; `\&amp;quot; | xargs mplayer $*&lt;br /&gt;
&lt;br /&gt;
###&lt;br /&gt;
# The script above grabs the HTML source to get at the real stream (the .flv file), which youtube constantly alters. The &amp;quot;http://www.youtube.com/get_video?video_id=&amp;quot; is always the prefix -- and the script uses wget to download and append the rest of the 'full' url (which is very long and stupid).. an ex.; &lt;br /&gt;
&lt;br /&gt;
#mplayer #&amp;quot;http://www.youtube.com/get_video?video_id=L2SED6sewRw&amp;amp;l=2965&amp;amp;sk=GOT2L_qmpVJOx0w#rud5ycdyuWziPg1lcC&amp;amp;fmt_map=6%2F720000%2F7%2F0%2F0&amp;amp;t=OEgsToPDskLlf4ls3xB6V84dMYLu#ndws&amp;amp;hl=en&amp;amp;plid=AARTXK76vCTnvQ5#JAAAC6ADCAAA&amp;amp;sdetail=rv%253AL2S#ED6sewRw&amp;amp;tk=P4Lg#O65y-u5BllZJBsB_e2Gw-OVgaMp4a8prsHTahDhuPN_xsReW2Q%3D%3D&amp;amp;title=Greg Kroah &lt;br /&gt;
# Hartman on the Linux Kernel&amp;quot;&lt;br /&gt;
# as you can see, there are also blank spaces that need to be cleaned up (blank spaces replaced with underscores).&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
created by enouf and n3kl and amphi&lt;br /&gt;
&lt;br /&gt;
RE-modified by enouf&lt;br /&gt;
&lt;br /&gt;
===Alternative Version===&lt;br /&gt;
Here is an alternative version by [[User:Elte|Elte]].&lt;br /&gt;
&lt;br /&gt;
The differences are:&lt;br /&gt;
* fewer program calls&lt;br /&gt;
* fewer shell pipes&lt;br /&gt;
* no dependency on grep, awk or tr&lt;br /&gt;
* supports double quotes in video url&lt;br /&gt;
&lt;br /&gt;
This version is nevertheless heavily based on the previous script.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
# © 2008,2009 Elte GPL V2.1&lt;br /&gt;
&lt;br /&gt;
URL=&amp;quot;$1&amp;quot;&lt;br /&gt;
shift&lt;br /&gt;
&lt;br /&gt;
if [ &amp;quot;${URL#http://}&amp;quot; = &amp;quot;$URL&amp;quot; ]&lt;br /&gt;
then&lt;br /&gt;
    echo &amp;quot;usage: $0 &amp;lt;youtube-URL&amp;gt; [mplayer args ...]&amp;quot;&lt;br /&gt;
    exit&lt;br /&gt;
fi&lt;br /&gt;
&lt;br /&gt;
echo \&amp;quot;http://www.youtube.com/get_video?video_id=$(wget -q -O - &amp;quot;$URL&amp;quot;                     \&lt;br /&gt;
                                                   | sed -e '/fullscreenUrl/!d'            \&lt;br /&gt;
                                                         -e &amp;quot;s/.*video_id=\([^']*\).*/\1/&amp;quot; \&lt;br /&gt;
                                                         -e 's/ /_/g'                      \&lt;br /&gt;
                                                         -e 's/\\\&amp;quot;/&amp;quot;/g'                   \&lt;br /&gt;
                                                  )\&amp;quot;                                      \&lt;br /&gt;
| xargs mplayer &amp;quot;$@&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Or just use youtube-dl===&lt;br /&gt;
Available in almost all Linux distributions.&lt;br /&gt;
&lt;br /&gt;
Usage:&lt;br /&gt;
&lt;br /&gt;
mplayer $(youtube-dl -g http://www.youtube.com/watch?v=.....)&lt;br /&gt;
&lt;br /&gt;
or&lt;br /&gt;
&lt;br /&gt;
mplayer $(youtube-dl -b -g http://www.youtube.com/watch?v=.....)&lt;br /&gt;
&lt;br /&gt;
to get the HD version if available.&lt;br /&gt;
&lt;br /&gt;
To avoid seeking issues, try updating to the latest SVN. It contains a special hack that works around a bug in youtube's HTTP server (it does not set Accept-Ranges, which incorrectly indicates it does not support seeking/partial requests - MPlayer now just assumes that seeking is possible if the server string is &amp;quot;gvs 1.0&amp;quot; - which is what youtube currently uses).&lt;br /&gt;
&lt;br /&gt;
If that does not help, try configuring MPlayer with http support via FFmpeg (--enable-protocol=&amp;quot;http_protocol&amp;quot; or --enable-protocol=&amp;quot;file_protocol pipe_protocol http_protocol rtmp_protocol tcp_protocol udp_protocol&amp;quot;) and use instead&lt;br /&gt;
&lt;br /&gt;
mplayer ffmpeg://$(youtube-dl -b -g http://www.youtube.com/watch?v=.....)&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=MPlayer_youtube_script&amp;diff=12112</id>
		<title>MPlayer youtube script</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=MPlayer_youtube_script&amp;diff=12112"/>
		<updated>2009-12-30T20:30:46Z</updated>

		<summary type="html">&lt;p&gt;Reimar: /* Or just use youtube-dl */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;bash script for playing youtube videos&lt;br /&gt;
&lt;br /&gt;
#save it as youtube.sh&lt;br /&gt;
#chmod +x youtube.sh&lt;br /&gt;
#place it somewhere in PATH (like /usr/local/bin )&lt;br /&gt;
&lt;br /&gt;
Usage:&lt;br /&gt;
*youtube.sh &amp;lt;url&amp;gt; [mplayer args]&lt;br /&gt;
*youtube.sh http://www.youtube.com/watch?v=example -dumpstream -dumpfile something.flv&lt;br /&gt;
*youtube.sh http://www.youtube.com/watch?v=example -aspect 16:9&lt;br /&gt;
*youtube.sh http://www.youtube.com/watch?v=example -xy 2&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
&lt;br /&gt;
if [ -z &amp;quot;$1&amp;quot; ]; then&lt;br /&gt;
        echo &amp;quot;No URL!&amp;quot;&lt;br /&gt;
        exit&lt;br /&gt;
fi&lt;br /&gt;
&lt;br /&gt;
url=$1&lt;br /&gt;
shift&lt;br /&gt;
&lt;br /&gt;
echo \&amp;quot;http://www.youtube.com/get_video?video_id=`wget -q -O - $url | grep fullscreenUrl | awk -F'video_id=' '{ print $2 }' | sed -e 's/ /_/g' | tr -d \'\; `\&amp;quot; | xargs mplayer $*&lt;br /&gt;
&lt;br /&gt;
###&lt;br /&gt;
# The script above grabs the HTML source to get at the real stream (the .flv file), which youtube constantly alters. The &amp;quot;http://www.youtube.com/get_video?video_id=&amp;quot; is always the prefix -- and the script uses wget to download and append the rest of the 'full' url (which is very long and stupid).. an ex.; &lt;br /&gt;
&lt;br /&gt;
#mplayer #&amp;quot;http://www.youtube.com/get_video?video_id=L2SED6sewRw&amp;amp;l=2965&amp;amp;sk=GOT2L_qmpVJOx0w#rud5ycdyuWziPg1lcC&amp;amp;fmt_map=6%2F720000%2F7%2F0%2F0&amp;amp;t=OEgsToPDskLlf4ls3xB6V84dMYLu#ndws&amp;amp;hl=en&amp;amp;plid=AARTXK76vCTnvQ5#JAAAC6ADCAAA&amp;amp;sdetail=rv%253AL2S#ED6sewRw&amp;amp;tk=P4Lg#O65y-u5BllZJBsB_e2Gw-OVgaMp4a8prsHTahDhuPN_xsReW2Q%3D%3D&amp;amp;title=Greg Kroah &lt;br /&gt;
# Hartman on the Linux Kernel&amp;quot;&lt;br /&gt;
# as you can see, there are also blank spaces that need to be cleaned up (blank spaces replaced with underscores).&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
created by enouf and n3kl and amphi&lt;br /&gt;
&lt;br /&gt;
RE-modified by enouf&lt;br /&gt;
&lt;br /&gt;
===Alternative Version===&lt;br /&gt;
Here is an alternative version by [[User:Elte|Elte]].&lt;br /&gt;
&lt;br /&gt;
The differences are:&lt;br /&gt;
* fewer program calls&lt;br /&gt;
* fewer shell pipes&lt;br /&gt;
* no dependency on grep, awk or tr&lt;br /&gt;
* supports double quotes in the video URL&lt;br /&gt;
&lt;br /&gt;
This version is nevertheless heavily based on the previous script.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
# © 2008,2009 Elte GPL V2.1&lt;br /&gt;
&lt;br /&gt;
URL=&amp;quot;$1&amp;quot;&lt;br /&gt;
shift&lt;br /&gt;
&lt;br /&gt;
if [ &amp;quot;${URL#http://}&amp;quot; = &amp;quot;$URL&amp;quot; ]&lt;br /&gt;
then&lt;br /&gt;
    echo &amp;quot;usage: $0 &amp;lt;youtube-URL&amp;gt; [mplayer args ...]&amp;quot;&lt;br /&gt;
    exit&lt;br /&gt;
fi&lt;br /&gt;
&lt;br /&gt;
echo \&amp;quot;http://www.youtube.com/get_video?video_id=$(wget -q -O - &amp;quot;$URL&amp;quot;                     \&lt;br /&gt;
                                                   | sed -e '/fullscreenUrl/!d'            \&lt;br /&gt;
                                                         -e &amp;quot;s/.*video_id=\([^']*\).*/\1/&amp;quot; \&lt;br /&gt;
                                                         -e 's/ /_/g'                      \&lt;br /&gt;
                                                         -e 's/\\\&amp;quot;/&amp;quot;/g'                   \&lt;br /&gt;
                                                  )\&amp;quot;                                      \&lt;br /&gt;
| xargs mplayer &amp;quot;$@&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Or just use youtube-dl===&lt;br /&gt;
Available in almost all Linux distributions.&lt;br /&gt;
&lt;br /&gt;
Usage:&lt;br /&gt;
&lt;br /&gt;
mplayer $(youtube-dl -g http://www.youtube.com/watch?v=.....)&lt;br /&gt;
&lt;br /&gt;
or&lt;br /&gt;
&lt;br /&gt;
mplayer $(youtube-dl -b -g http://www.youtube.com/watch?v=.....)&lt;br /&gt;
&lt;br /&gt;
to get the HD version if available.&lt;br /&gt;
&lt;br /&gt;
To resolve seeking issues, try configuring MPlayer with http support via FFmpeg (--enable-protocol=&amp;quot;http_protocol&amp;quot; or --enable-protocol=&amp;quot;file_protocol pipe_protocol http_protocol rtmp_protocol tcp_protocol udp_protocol&amp;quot;) and use instead&lt;br /&gt;
&lt;br /&gt;
mplayer ffmpeg://$(youtube-dl -b -g http://www.youtube.com/watch?v=.....)&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=MPlayer_youtube_script&amp;diff=12111</id>
		<title>MPlayer youtube script</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=MPlayer_youtube_script&amp;diff=12111"/>
		<updated>2009-12-30T20:25:24Z</updated>

		<summary type="html">&lt;p&gt;Reimar: /* Or just use youtube-dl */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;bash script for playing youtube videos&lt;br /&gt;
&lt;br /&gt;
#save it as youtube.sh&lt;br /&gt;
#chmod +x youtube.sh&lt;br /&gt;
#place it somewhere in PATH (like /usr/local/bin )&lt;br /&gt;
&lt;br /&gt;
Usage:&lt;br /&gt;
*youtube.sh &amp;lt;url&amp;gt; [mplayer args]&lt;br /&gt;
*youtube.sh http://www.youtube.com/watch?v=example -dumpstream -dumpfile something.flv&lt;br /&gt;
*youtube.sh http://www.youtube.com/watch?v=example -aspect 16:9&lt;br /&gt;
*youtube.sh http://www.youtube.com/watch?v=example -xy 2&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
&lt;br /&gt;
if [ -z &amp;quot;$1&amp;quot; ]; then&lt;br /&gt;
        echo &amp;quot;No URL!&amp;quot;&lt;br /&gt;
        exit&lt;br /&gt;
fi&lt;br /&gt;
&lt;br /&gt;
url=$1&lt;br /&gt;
shift&lt;br /&gt;
&lt;br /&gt;
echo \&amp;quot;http://www.youtube.com/get_video?video_id=`wget -q -O - $url | grep fullscreenUrl | awk -F'video_id=' '{ print $2 }' | sed -e 's/ /_/g' | tr -d \'\; `\&amp;quot; | xargs mplayer $*&lt;br /&gt;
&lt;br /&gt;
###&lt;br /&gt;
# The script above grabs the HTML source to find the real stream (the .flv file), which youtube constantly alters. The &amp;quot;http://www.youtube.com/get_video?video_id=&amp;quot; is always the prefix -- the script uses wget to download and append the rest of the 'full' URL (which is very long), for example:&lt;br /&gt;
&lt;br /&gt;
#mplayer #&amp;quot;http://www.youtube.com/get_video?video_id=L2SED6sewRw&amp;amp;l=2965&amp;amp;sk=GOT2L_qmpVJOx0w#rud5ycdyuWziPg1lcC&amp;amp;fmt_map=6%2F720000%2F7%2F0%2F0&amp;amp;t=OEgsToPDskLlf4ls3xB6V84dMYLu#ndws&amp;amp;hl=en&amp;amp;plid=AARTXK76vCTnvQ5#JAAAC6ADCAAA&amp;amp;sdetail=rv%253AL2S#ED6sewRw&amp;amp;tk=P4Lg#O65y-u5BllZJBsB_e2Gw-OVgaMp4a8prsHTahDhuPN_xsReW2Q%3D%3D&amp;amp;title=Greg Kroah &lt;br /&gt;
# Hartman on the Linux Kernel&amp;quot;&lt;br /&gt;
# As you can see, there are also blank spaces that need to be cleaned up (replaced with underscores).&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
created by enouf and n3kl and amphi&lt;br /&gt;
&lt;br /&gt;
RE-modified by enouf&lt;br /&gt;
&lt;br /&gt;
===Alternative Version===&lt;br /&gt;
Here is an alternative version by [[User:Elte|Elte]].&lt;br /&gt;
&lt;br /&gt;
The differences are:&lt;br /&gt;
* fewer program calls&lt;br /&gt;
* fewer shell pipes&lt;br /&gt;
* no dependency on grep, awk or tr&lt;br /&gt;
* supports double quotes in the video URL&lt;br /&gt;
&lt;br /&gt;
This version is nevertheless heavily based on the previous script.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
# © 2008,2009 Elte GPL V2.1&lt;br /&gt;
&lt;br /&gt;
URL=&amp;quot;$1&amp;quot;&lt;br /&gt;
shift&lt;br /&gt;
&lt;br /&gt;
if [ &amp;quot;${URL#http://}&amp;quot; = &amp;quot;$URL&amp;quot; ]&lt;br /&gt;
then&lt;br /&gt;
    echo &amp;quot;usage: $0 &amp;lt;youtube-URL&amp;gt; [mplayer args ...]&amp;quot;&lt;br /&gt;
    exit&lt;br /&gt;
fi&lt;br /&gt;
&lt;br /&gt;
echo \&amp;quot;http://www.youtube.com/get_video?video_id=$(wget -q -O - &amp;quot;$URL&amp;quot;                     \&lt;br /&gt;
                                                   | sed -e '/fullscreenUrl/!d'            \&lt;br /&gt;
                                                         -e &amp;quot;s/.*video_id=\([^']*\).*/\1/&amp;quot; \&lt;br /&gt;
                                                         -e 's/ /_/g'                      \&lt;br /&gt;
                                                         -e 's/\\\&amp;quot;/&amp;quot;/g'                   \&lt;br /&gt;
                                                  )\&amp;quot;                                      \&lt;br /&gt;
| xargs mplayer &amp;quot;$@&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Or just use youtube-dl===&lt;br /&gt;
Available in almost all Linux distributions.&lt;br /&gt;
&lt;br /&gt;
Usage:&lt;br /&gt;
&lt;br /&gt;
mplayer $(youtube-dl -g http://www.youtube.com/watch?v=.....)&lt;br /&gt;
&lt;br /&gt;
or&lt;br /&gt;
&lt;br /&gt;
mplayer $(youtube-dl -b -g http://www.youtube.com/watch?v=.....)&lt;br /&gt;
&lt;br /&gt;
to get the HD version if available.&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=MPlayer_youtube_script&amp;diff=12110</id>
		<title>MPlayer youtube script</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=MPlayer_youtube_script&amp;diff=12110"/>
		<updated>2009-12-30T20:21:54Z</updated>

		<summary type="html">&lt;p&gt;Reimar: Or just use youtube-dl...&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;bash script for playing youtube videos&lt;br /&gt;
&lt;br /&gt;
#save it as youtube.sh&lt;br /&gt;
#chmod +x youtube.sh&lt;br /&gt;
#place it somewhere in PATH (like /usr/local/bin )&lt;br /&gt;
&lt;br /&gt;
Usage:&lt;br /&gt;
*youtube.sh &amp;lt;url&amp;gt; [mplayer args]&lt;br /&gt;
*youtube.sh http://www.youtube.com/watch?v=example -dumpstream -dumpfile something.flv&lt;br /&gt;
*youtube.sh http://www.youtube.com/watch?v=example -aspect 16:9&lt;br /&gt;
*youtube.sh http://www.youtube.com/watch?v=example -xy 2&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
&lt;br /&gt;
if [ -z &amp;quot;$1&amp;quot; ]; then&lt;br /&gt;
        echo &amp;quot;No URL!&amp;quot;&lt;br /&gt;
        exit&lt;br /&gt;
fi&lt;br /&gt;
&lt;br /&gt;
url=$1&lt;br /&gt;
shift&lt;br /&gt;
&lt;br /&gt;
echo \&amp;quot;http://www.youtube.com/get_video?video_id=`wget -q -O - $url | grep fullscreenUrl | awk -F'video_id=' '{ print $2 }' | sed -e 's/ /_/g' | tr -d \'\; `\&amp;quot; | xargs mplayer $*&lt;br /&gt;
&lt;br /&gt;
###&lt;br /&gt;
# The script above grabs the HTML source to find the real stream (the .flv file), which youtube constantly alters. The &amp;quot;http://www.youtube.com/get_video?video_id=&amp;quot; is always the prefix -- the script uses wget to download and append the rest of the 'full' URL (which is very long), for example:&lt;br /&gt;
&lt;br /&gt;
#mplayer #&amp;quot;http://www.youtube.com/get_video?video_id=L2SED6sewRw&amp;amp;l=2965&amp;amp;sk=GOT2L_qmpVJOx0w#rud5ycdyuWziPg1lcC&amp;amp;fmt_map=6%2F720000%2F7%2F0%2F0&amp;amp;t=OEgsToPDskLlf4ls3xB6V84dMYLu#ndws&amp;amp;hl=en&amp;amp;plid=AARTXK76vCTnvQ5#JAAAC6ADCAAA&amp;amp;sdetail=rv%253AL2S#ED6sewRw&amp;amp;tk=P4Lg#O65y-u5BllZJBsB_e2Gw-OVgaMp4a8prsHTahDhuPN_xsReW2Q%3D%3D&amp;amp;title=Greg Kroah &lt;br /&gt;
# Hartman on the Linux Kernel&amp;quot;&lt;br /&gt;
# As you can see, there are also blank spaces that need to be cleaned up (replaced with underscores).&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
created by enouf and n3kl and amphi&lt;br /&gt;
&lt;br /&gt;
RE-modified by enouf&lt;br /&gt;
&lt;br /&gt;
===Alternative Version===&lt;br /&gt;
Here is an alternative version by [[User:Elte|Elte]].&lt;br /&gt;
&lt;br /&gt;
The differences are:&lt;br /&gt;
* fewer program calls&lt;br /&gt;
* fewer shell pipes&lt;br /&gt;
* no dependency on grep, awk or tr&lt;br /&gt;
* supports double quotes in the video URL&lt;br /&gt;
&lt;br /&gt;
This version is nevertheless heavily based on the previous script.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
# © 2008,2009 Elte GPL V2.1&lt;br /&gt;
&lt;br /&gt;
URL=&amp;quot;$1&amp;quot;&lt;br /&gt;
shift&lt;br /&gt;
&lt;br /&gt;
if [ &amp;quot;${URL#http://}&amp;quot; = &amp;quot;$URL&amp;quot; ]&lt;br /&gt;
then&lt;br /&gt;
    echo &amp;quot;usage: $0 &amp;lt;youtube-URL&amp;gt; [mplayer args ...]&amp;quot;&lt;br /&gt;
    exit&lt;br /&gt;
fi&lt;br /&gt;
&lt;br /&gt;
echo \&amp;quot;http://www.youtube.com/get_video?video_id=$(wget -q -O - &amp;quot;$URL&amp;quot;                     \&lt;br /&gt;
                                                   | sed -e '/fullscreenUrl/!d'            \&lt;br /&gt;
                                                         -e &amp;quot;s/.*video_id=\([^']*\).*/\1/&amp;quot; \&lt;br /&gt;
                                                         -e 's/ /_/g'                      \&lt;br /&gt;
                                                         -e 's/\\\&amp;quot;/&amp;quot;/g'                   \&lt;br /&gt;
                                                  )\&amp;quot;                                      \&lt;br /&gt;
| xargs mplayer &amp;quot;$@&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
===Or just use youtube-dl===&lt;br /&gt;
Available in almost all Linux distributions.&lt;br /&gt;
Usage:&lt;br /&gt;
mplayer $(youtube-dl -g http://www.youtube.com/watch?v=.....)&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=IMA_ADPCM&amp;diff=11727</id>
		<title>IMA ADPCM</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=IMA_ADPCM&amp;diff=11727"/>
		<updated>2009-06-24T13:23:42Z</updated>

		<summary type="html">&lt;p&gt;Reimar: /* Optimization */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The [[Interactive Multimedia Association]] (IMA) developed an ADPCM algorithm designed to be used in entertainment multimedia applications. It is particularly fast to encode and decode and does not strictly require any multiplications or floating point operations.&lt;br /&gt;
&lt;br /&gt;
While the encoding and decoding algorithms remain more or less constant across different IMA implementations, the specific on-disk data formats vary. This page describes the common IMA decoding algorithm. See the [[:Category:IMA ADPCM Audio Codecs]] page for various formats used for storing the data on disk.&lt;br /&gt;
&lt;br /&gt;
== Decoding IMA ==&lt;br /&gt;
To decode IMA ADPCM, initialize 3 variables:&lt;br /&gt;
&lt;br /&gt;
* predictor: This is either initialized from the data chunk preamble specified in the format or is initialized to 0 at the start of the decoding process.&lt;br /&gt;
* step index: Similar to the initial predictor, this variable is initialized from the data chunk preamble or set to 0 at the start of the decoding process.&lt;br /&gt;
* step: This variable is initialized to ima_step_table[step_index].&lt;br /&gt;
&lt;br /&gt;
The encoded IMA bitstream is composed of a series of 4-bit nibbles, so each byte holds 2 IMA nibbles. The specific data format dictates whether the stream is decoded top nibble first or bottom nibble first, and whether there is stereo interleaving within the IMA nibbles. For this discussion, imagine the IMA bitstream as a series of nibbles representing a single audio channel:&lt;br /&gt;
&lt;br /&gt;
 n0 n1 n2 n3 n4 n5 ... &lt;br /&gt;
&lt;br /&gt;
Each nibble serves as both a table index and a sign/magnitude number during the decoding process. Transform each nibble in the stream into a signed 16-bit PCM sample using the following process:&lt;br /&gt;
&lt;br /&gt;
 step_index = step_index + ima_index_table[(unsigned)nibble]&lt;br /&gt;
 &lt;br /&gt;
 diff = ((signed)nibble + 0.5) * step / 4&lt;br /&gt;
 &lt;br /&gt;
 predictor = predictor + diff&lt;br /&gt;
 &lt;br /&gt;
 step = ima_step_table[step_index]&lt;br /&gt;
&lt;br /&gt;
Regarding the step index and predictor calculations: Be sure to saturate the computed step index between 0 and 88 (table limits) and the predictor between -32768 and 32767 (signed 16-bit number range). It is possible for these values to go out of range, which could cause undesirable program behavior if left unchecked.&lt;br /&gt;
&lt;br /&gt;
== Optimization ==&lt;br /&gt;
&lt;br /&gt;
A note about the following calculation:&lt;br /&gt;
&lt;br /&gt;
 diff = ((sign/mag.)nibble + 0.5) * step / 4&lt;br /&gt;
	&lt;br /&gt;
At first glance, it appears that this calculation requires floating point operations and an arbitrary (not power-of-2) multiplication. However, some numerical manipulations reveal some useful simplifications:&lt;br /&gt;
&lt;br /&gt;
 diff = ((step * nibble) + (step / 2)) / 4&lt;br /&gt;
 &lt;br /&gt;
 diff =	(step * nibble / 4) + (step / 8)&lt;br /&gt;
	&lt;br /&gt;
The step / 8 calculation can be expressed as a bit shift right by 3 (step SHR 3). The first part of the equation can also be simplified. Since a nibble only carries 4 bits, and those 4 bits are a sign/magnitude number, there are only 3 bits of magnitude information. If all 3 magnitude bits are set to 1:&lt;br /&gt;
&lt;br /&gt;
 nibble = 4 + 2 + 1&lt;br /&gt;
 &lt;br /&gt;
 step * nibble / 4 = (4 * step / 4) + (2 * step / 4) + (1 * step / 4) = step + (step / 2) + (step / 4)&lt;br /&gt;
&lt;br /&gt;
Thus, if bit 2 of the nibble is set, add step to diff. If bit 1 is set, add (step / 2 = step SHR 1) to diff. If bit 0 is set, add (step / 4 = step SHR 2) to diff. Finally, if the sign bit is set, subtract the final diff value from the predictor value; otherwise, add the final diff value to the predictor value. The usual algorithm is as follows:&lt;br /&gt;
&lt;br /&gt;
 sign = nibble &amp;amp; 8&lt;br /&gt;
 delta = nibble &amp;amp; 7&lt;br /&gt;
 diff = step &amp;gt;&amp;gt; 3&lt;br /&gt;
 if (delta &amp;amp; 4) diff += step&lt;br /&gt;
 if (delta &amp;amp; 2) diff += (step &amp;gt;&amp;gt; 1)&lt;br /&gt;
 if (delta &amp;amp; 1) diff += (step &amp;gt;&amp;gt; 2)&lt;br /&gt;
 if (sign) predictor -= diff&lt;br /&gt;
 else predictor += diff &lt;br /&gt;
&lt;br /&gt;
This method was particularly useful back when IMA was implemented on commodity CPUs which were relatively slow at multiplication. One multiplication per audio sample had a notable impact on program performance, as opposed to the series of branches, additions and logical bit operations. If multiplication performance is not an issue, it is possible to carry out the diff calculation with only one non-power-of-2 multiplication and no floating point numbers:&lt;br /&gt;
&lt;br /&gt;
 diff = ((((signed)nibble+0.5) * step) / 4) * (2 / 2)&lt;br /&gt;
 &lt;br /&gt;
 diff = (nibble + 0.5) * 2 * step / 8&lt;br /&gt;
 &lt;br /&gt;
 diff = (2 * nibble + 1) * step / 8&lt;br /&gt;
&lt;br /&gt;
[NOTE: something seems wrong here, e.g. for nibble = 3 and step = 3 we would get with the upper formula&lt;br /&gt;
diff = 0 + 1 + 0 = 1&lt;br /&gt;
and with the lower one&lt;br /&gt;
diff = (2 * 3 + 1) * 3 / 8 = 21 / 8 = 2&lt;br /&gt;
Since with ADPCM errors can propagate forever that seems like a really bad thing...&lt;br /&gt;
]&lt;br /&gt;
&lt;br /&gt;
== Decoding Tables ==&lt;br /&gt;
&lt;br /&gt;
 int index_table[16] = {&lt;br /&gt;
   -1, -1, -1, -1, 2, 4, 6, 8,&lt;br /&gt;
   -1, -1, -1, -1, 2, 4, 6, 8&lt;br /&gt;
 }; &lt;br /&gt;
&lt;br /&gt;
Note that many programs use slight deviations from the following table, but such deviations are negligible:&lt;br /&gt;
&lt;br /&gt;
 int step_table[89] = { &lt;br /&gt;
   7, 8, 9, 10, 11, 12, 13, 14, 16, 17, &lt;br /&gt;
   19, 21, 23, 25, 28, 31, 34, 37, 41, 45, &lt;br /&gt;
   50, 55, 60, 66, 73, 80, 88, 97, 107, 118, &lt;br /&gt;
   130, 143, 157, 173, 190, 209, 230, 253, 279, 307,&lt;br /&gt;
   337, 371, 408, 449, 494, 544, 598, 658, 724, 796,&lt;br /&gt;
   876, 963, 1060, 1166, 1282, 1411, 1552, 1707, 1878, 2066, &lt;br /&gt;
   2272, 2499, 2749, 3024, 3327, 3660, 4026, 4428, 4871, 5358,&lt;br /&gt;
   5894, 6484, 7132, 7845, 8630, 9493, 10442, 11487, 12635, 13899, &lt;br /&gt;
   15289, 16818, 18500, 20350, 22385, 24623, 27086, 29794, 32767 &lt;br /&gt;
 }; &lt;br /&gt;
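Putting the branch-based algorithm and the tables above together, here is a minimal Python sketch of the decoding process (the function name and signature are illustrative, not taken from any particular implementation):&lt;br /&gt;

```python
# Minimal sketch of IMA ADPCM nibble decoding using the tables above.
INDEX_TABLE = [-1, -1, -1, -1, 2, 4, 6, 8,
               -1, -1, -1, -1, 2, 4, 6, 8]

STEP_TABLE = [
    7, 8, 9, 10, 11, 12, 13, 14, 16, 17,
    19, 21, 23, 25, 28, 31, 34, 37, 41, 45,
    50, 55, 60, 66, 73, 80, 88, 97, 107, 118,
    130, 143, 157, 173, 190, 209, 230, 253, 279, 307,
    337, 371, 408, 449, 494, 544, 598, 658, 724, 796,
    876, 963, 1060, 1166, 1282, 1411, 1552, 1707, 1878, 2066,
    2272, 2499, 2749, 3024, 3327, 3660, 4026, 4428, 4871, 5358,
    5894, 6484, 7132, 7845, 8630, 9493, 10442, 11487, 12635, 13899,
    15289, 16818, 18500, 20350, 22385, 24623, 27086, 29794, 32767,
]

def ima_decode(nibbles, predictor=0, step_index=0):
    """Decode a sequence of 4-bit nibbles into signed 16-bit samples."""
    samples = []
    step = STEP_TABLE[step_index]
    for nibble in nibbles:
        sign = nibble // 8       # bit 3: sign flag
        delta = nibble % 8       # bits 0-2: magnitude
        # Branch-based diff computation, avoiding multiplication.
        diff = step >> 3
        if delta // 4 % 2:
            diff += step
        if delta // 2 % 2:
            diff += step >> 1
        if delta % 2:
            diff += step >> 2
        if sign:
            predictor -= diff
        else:
            predictor += diff
        # Saturate to the signed 16-bit range and the table limits.
        predictor = max(-32768, min(32767, predictor))
        step_index = max(0, min(88, step_index + INDEX_TABLE[nibble]))
        step = STEP_TABLE[step_index]
        samples.append(predictor)
    return samples
```

For example, starting from predictor 0 and step index 0, the nibble sequence 7, 8 decodes to the samples 11 and 9.&lt;br /&gt;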
&lt;br /&gt;
[[Category:Audio Codecs]]&lt;br /&gt;
[[Category:ADPCM Audio Codecs]]&lt;br /&gt;
[[Category:IMA ADPCM Audio Codecs]]&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=UMV&amp;diff=11587</id>
		<title>UMV</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=UMV&amp;diff=11587"/>
		<updated>2009-05-15T10:51:24Z</updated>

		<summary type="html">&lt;p&gt;Reimar: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;* Extension: UMV&lt;br /&gt;
* Samples: http://samples.mplayerhq.hu/game-formats/umv/&lt;br /&gt;
&lt;br /&gt;
UMV is (suspected to be) a full motion video file format used in the DOS game [http://www.mobygames.com/game/dos/are-you-afraid-of-the-dark-the-tale-of-orpheos-curse Are You Afraid of the Dark? The Tale of Orpheo's Curse].&lt;br /&gt;
&lt;br /&gt;
[[Category:Video Codecs]]&lt;br /&gt;
[[Category:Game Formats]]&lt;br /&gt;
[[Category:Undiscovered Game Formats]]&lt;br /&gt;
&lt;br /&gt;
The format is split into individual &amp;quot;packets&amp;quot;, with no special header.&lt;br /&gt;
Each packet starts with a 4-byte packet size (big-endian, including the 4 bytes for the size field itself), followed by the size value of the previous packet (0 if there is no previous one).&lt;br /&gt;
The following 4 bytes are probably some kind of &amp;quot;packet type&amp;quot; value.&lt;br /&gt;
Packets are grouped together, with the last packet in a group having a size value of 0 and type 00 00 00 80.&lt;br /&gt;
After each packet group follows a section of PCM audio samples, up to the start of the next packet group.&lt;br /&gt;
The start offset of the next packet group is coded in the first packet of the current group; such a packet starts, for example, like this:&lt;br /&gt;
00 00 00 40 00 00 00 00 00 00 00 02 00 02 c8 00&lt;br /&gt;
where 0x40 is the size of that first packet, 0 is the size of the previous one, the packet &amp;quot;type&amp;quot; is 2 and 0x2c800 is the start offset of the next packet group.&lt;br /&gt;
&lt;br /&gt;
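Based on the layout just described, the packet header fields could be read with a small Python sketch (the function names are hypothetical; the next-group-offset field only applies to the first packet of a group):&lt;br /&gt;

```python
# Sketch of parsing a UMV packet header as described above.
def parse_umv_packet_header(buf, off=0):
    size = int.from_bytes(buf[off:off + 4], "big")        # includes these 4 bytes
    prev_size = int.from_bytes(buf[off + 4:off + 8], "big")
    ptype = int.from_bytes(buf[off + 8:off + 12], "big")  # probable "packet type"
    return size, prev_size, ptype

# In the first packet of a group, the next 4 bytes hold the
# start offset of the next packet group.
def next_group_offset(buf, off=0):
    return int.from_bytes(buf[off + 12:off + 16], "big")
```

Applied to the example bytes shown above, this yields size 0x40, previous size 0, type 2 and next-group offset 0x2c800.&lt;br /&gt;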
The PCM &amp;quot;packets&amp;quot; are strange: in V_END7.UMV, for example, most packets seem to contain only 8820 bytes of audio, with the remaining bytes filled with the value 0x80 - whereas e.g. the first PCM packet and the one starting at 0x47f2c seem to contain audio data all the way.&lt;br /&gt;
Possibly the value 0x80 is not allowed in the signed PCM format used but instead indicates padding/values to be dropped?&lt;br /&gt;
&lt;br /&gt;
Video seems to be a paletted format, where the palette is stored in packets of type 0x40 in a 1-byte per component, 3 component format (i.e. 768 bytes for a full palette). It is possible that these palette packets specify the first and last palette entry they replace, so only parts of the palette can be changed.&lt;br /&gt;
Packets of type 0x20 might code full frames/keyframes at a resolution of 160x120.&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=UMV&amp;diff=11586</id>
		<title>UMV</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=UMV&amp;diff=11586"/>
		<updated>2009-05-15T10:40:27Z</updated>

		<summary type="html">&lt;p&gt;Reimar: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;* Extension: UMV&lt;br /&gt;
* Samples: http://samples.mplayerhq.hu/game-formats/umv/&lt;br /&gt;
&lt;br /&gt;
UMV is (suspected to be) a full motion video file format used in the DOS game [http://www.mobygames.com/game/dos/are-you-afraid-of-the-dark-the-tale-of-orpheos-curse Are You Afraid of the Dark? The Tale of Orpheo's Curse].&lt;br /&gt;
&lt;br /&gt;
[[Category:Video Codecs]]&lt;br /&gt;
[[Category:Game Formats]]&lt;br /&gt;
[[Category:Undiscovered Game Formats]]&lt;br /&gt;
&lt;br /&gt;
The format is split into individual &amp;quot;packets&amp;quot;, with no special header.&lt;br /&gt;
Each packet starts with a 4-byte packet size (big-endian, including the 4 bytes for the size field itself), followed by the size value of the previous packet (0 if there is no previous one).&lt;br /&gt;
The following 4 bytes are probably some kind of &amp;quot;packet type&amp;quot; value.&lt;br /&gt;
Packets are grouped together, with the last packet in a group having a size value of 0 and type 00 00 00 80.&lt;br /&gt;
After each packet group follows a section of PCM audio samples, up to the start of the next packet group.&lt;br /&gt;
The start offset of the next packet group is coded in the first packet of the current group; such a packet starts, for example, like this:&lt;br /&gt;
00 00 00 40 00 00 00 00 00 00 00 02 00 02 c8 00&lt;br /&gt;
where 0x40 is the size of that first packet, 0 is the size of the previous one, the packet &amp;quot;type&amp;quot; is 2 and 0x2c800 is the start offset of the next packet group.&lt;br /&gt;
&lt;br /&gt;
The PCM &amp;quot;packets&amp;quot; are strange: in V_END7.UMV, for example, most packets seem to contain only 8820 bytes of audio, with the remaining bytes filled with the value 0x80 - whereas e.g. the first PCM packet and the one starting at 0x47f2c seem to contain audio data all the way.&lt;br /&gt;
Possibly the value 0x80 is not allowed in the signed PCM format used but instead indicates padding/values to be dropped?&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=UMV&amp;diff=11585</id>
		<title>UMV</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=UMV&amp;diff=11585"/>
		<updated>2009-05-15T10:25:00Z</updated>

		<summary type="html">&lt;p&gt;Reimar: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;* Extension: UMV&lt;br /&gt;
* Samples: http://samples.mplayerhq.hu/game-formats/umv/&lt;br /&gt;
&lt;br /&gt;
UMV is (suspected to be) a full motion video file format used in the DOS game [http://www.mobygames.com/game/dos/are-you-afraid-of-the-dark-the-tale-of-orpheos-curse Are You Afraid of the Dark? The Tale of Orpheo's Curse].&lt;br /&gt;
&lt;br /&gt;
[[Category:Video Codecs]]&lt;br /&gt;
[[Category:Game Formats]]&lt;br /&gt;
[[Category:Undiscovered Game Formats]]&lt;br /&gt;
&lt;br /&gt;
The format is split into individual &amp;quot;packets&amp;quot;, with no special header.&lt;br /&gt;
Each packet starts with a 4-byte packet size (big-endian, including the 4 bytes for the size field itself), followed by the size value of the previous packet (0 if there is no previous one).&lt;br /&gt;
The following 4 bytes are probably some kind of &amp;quot;packet type&amp;quot; value.&lt;br /&gt;
Packets are grouped together, with the last packet in a group having a size value of 0 and type 00 00 00 80.&lt;br /&gt;
After each packet group follows a section of PCM audio samples, up to the start of the next packet group.&lt;br /&gt;
The start offset of the next packet group is coded in the first packet of the current group; such a packet starts, for example, like this:&lt;br /&gt;
00 00 00 40 00 00 00 00 00 00 00 02 00 02 c8 00&lt;br /&gt;
where 0x40 is the size of that first packet, 0 is the size of the previous one, the packet &amp;quot;type&amp;quot; is 2 and 0x2c800 is the start offset of the next packet group.&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=UMV&amp;diff=11581</id>
		<title>UMV</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=UMV&amp;diff=11581"/>
		<updated>2009-05-14T21:13:57Z</updated>

		<summary type="html">&lt;p&gt;Reimar: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;* Extension: UMV&lt;br /&gt;
* Samples: http://samples.mplayerhq.hu/game-formats/umv/&lt;br /&gt;
&lt;br /&gt;
UMV is (suspected to be) a full motion video file format used in the DOS game [http://www.mobygames.com/game/dos/are-you-afraid-of-the-dark-the-tale-of-orpheos-curse Are You Afraid of the Dark? The Tale of Orpheo's Curse].&lt;br /&gt;
&lt;br /&gt;
[[Category:Video Codecs]]&lt;br /&gt;
[[Category:Game Formats]]&lt;br /&gt;
[[Category:Undiscovered Game Formats]]&lt;br /&gt;
&lt;br /&gt;
The format is split into individual &amp;quot;packets&amp;quot;, with no special header.&lt;br /&gt;
Each packet starts with a 4-byte packet size (big-endian, including the 4 bytes for the size field itself), followed by the size value of the previous packet (0 if there is no previous one).&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=UMV&amp;diff=11580</id>
		<title>UMV</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=UMV&amp;diff=11580"/>
		<updated>2009-05-14T21:12:26Z</updated>

		<summary type="html">&lt;p&gt;Reimar: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;* Extension: UMV&lt;br /&gt;
* Samples: http://samples.mplayerhq.hu/game-formats/umv/&lt;br /&gt;
&lt;br /&gt;
UMV is (suspected to be) a full motion video file format used in the DOS game [http://www.mobygames.com/game/dos/are-you-afraid-of-the-dark-the-tale-of-orpheos-curse Are You Afraid of the Dark? The Tale of Orpheo's Curse].&lt;br /&gt;
&lt;br /&gt;
[[Category:Video Codecs]]&lt;br /&gt;
[[Category:Game Formats]]&lt;br /&gt;
[[Category:Undiscovered Game Formats]]&lt;br /&gt;
&lt;br /&gt;
The format is split into individual &amp;quot;packets&amp;quot;, with no special header.&lt;br /&gt;
Each packet starts with a 4-byte packet size (little-endian, including the size itself), followed by the size value of the previous packet (0 if there is no previous one).&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=CDXL&amp;diff=11494</id>
		<title>CDXL</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=CDXL&amp;diff=11494"/>
		<updated>2009-04-08T14:09:30Z</updated>

		<summary type="html">&lt;p&gt;Reimar: /* Video */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;* Company: [[Commodore]]&lt;br /&gt;
* Patents: US 5,293,606, &amp;quot;Apparatus and method for transferring interleaved data objects in mass storage devices into separate destinations in memory&amp;quot;&lt;br /&gt;
* See also: [http://en.wikipedia.org/wiki/CDXL Wikipedia CDXL article], [http://web.archive.org/web/20030108081402/http://home.t-online.de/home/K_Andreas/CDXL.HTM German article with technical details] (archive.org)&lt;br /&gt;
* Amiga decoder with source code in Pascal: http://aminet.net/package/gfx/show/AnimFX&lt;br /&gt;
* Sample: ftp://ffmpeg.org/MPlayer/samples/game-formats/cdxl/ , http://valerio.diinoweb.com/files/amigaanimecd/index.htm&lt;br /&gt;
&lt;br /&gt;
CDXL is an uncompressed video format created by Commodore for playback from CD-ROM on Amiga computers.&lt;br /&gt;
&lt;br /&gt;
== File format ==&lt;br /&gt;
&lt;br /&gt;
CDXL files have no identifying markers or file headers; the file contains a number of frames, each prefixed by a frame header.&lt;br /&gt;
&lt;br /&gt;
  Frame header | Palette | Video | Audio&lt;br /&gt;
&lt;br /&gt;
All multi-byte integers are big endian.&lt;br /&gt;
&lt;br /&gt;
== Frame header ==&lt;br /&gt;
&lt;br /&gt;
  byte 0        File type&lt;br /&gt;
  byte 1        Info byte&lt;br /&gt;
    bits 0-2      Video encoding&lt;br /&gt;
    bit 3         Stereo flag&lt;br /&gt;
    bits 5-7      Plane arrangement&lt;br /&gt;
  bytes 2-5     Current chunk size&lt;br /&gt;
  bytes 6-9     Previous chunk size&lt;br /&gt;
  bytes 10-11   Reserved&lt;br /&gt;
  bytes 12-13   Current frame number (1 for first frame)&lt;br /&gt;
  bytes 14-15   Video width&lt;br /&gt;
  bytes 16-17   Video height&lt;br /&gt;
  bytes 18-19   Number of bit planes&lt;br /&gt;
  bytes 20-21   Palette size in bytes&lt;br /&gt;
  bytes 22-23   Sound size in bytes&lt;br /&gt;
  bytes 24-31   Reserved&lt;br /&gt;
&lt;br /&gt;
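As a sketch, the 32-byte frame header above can be unpacked in Python (big-endian throughout; the field names are illustrative):&lt;br /&gt;

```python
import struct

# Sketch: unpack the 32-byte CDXL frame header described above.
def parse_cdxl_header(data):
    (ftype, info, cur_size, prev_size, _reserved, frame_no,
     width, height, planes, pal_size, snd_size) = struct.unpack(">BBII7H8x", data[:32])
    return {
        "file_type": ftype,
        "encoding": info % 8,      # bits 0-2: video encoding
        "stereo": info // 8 % 2,   # bit 3: stereo flag
        "arrangement": info >> 5,  # bits 5-7: plane arrangement
        "chunk_size": cur_size,
        "prev_chunk_size": prev_size,
        "frame": frame_no,
        "width": width,
        "height": height,
        "planes": planes,
        "palette_bytes": pal_size,
        "sound_bytes": snd_size,
    }
```

The trailing &amp;quot;8x&amp;quot; in the format string skips the reserved bytes 24-31.&lt;br /&gt;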
File type values (it is unknown what these mean):&lt;br /&gt;
&lt;br /&gt;
  0  Custom CDXL&lt;br /&gt;
  1  Standard CDXL&lt;br /&gt;
  2  Special CDXL&lt;br /&gt;
&lt;br /&gt;
Video encoding values:&lt;br /&gt;
&lt;br /&gt;
  0  RGB&lt;br /&gt;
  1  HAM&lt;br /&gt;
  2  YUV&lt;br /&gt;
  3  AVM &amp;amp; DCTV&lt;br /&gt;
&lt;br /&gt;
Plane arrangement values:&lt;br /&gt;
  0  Bit planar&lt;br /&gt;
  1  Byte planar&lt;br /&gt;
  2  Chunky&lt;br /&gt;
  4  Bit line&lt;br /&gt;
  6  Byte line&lt;br /&gt;
&lt;br /&gt;
== Palette ==&lt;br /&gt;
&lt;br /&gt;
The palette is encoded as 12-bit RGB values (4 bits each of R, G, and B) stored in 16-bit words with the upper 4 bits unused and set to 0.  The palette size field in the header is the number of bytes in the palette, i.e. double the number of entries in the palette.&lt;br /&gt;
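For illustration, one palette word can be split into its 4-bit components like this (Python sketch; the expansion to 8 bits per component by bit replication is an assumption for display purposes, not something specified by the format):&lt;br /&gt;

```python
# Sketch: split a 16-bit CDXL palette word (0x0RGB) into components.
def cdxl_palette_entry(word):
    r = (word >> 8) % 16
    g = (word >> 4) % 16
    b = word % 16
    # Assumed expansion of 4-bit values to 8 bits by bit replication.
    return (r * 17, g * 17, b * 17)
```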
&lt;br /&gt;
== Audio ==&lt;br /&gt;
&lt;br /&gt;
Audio is encoded in standard uncompressed signed 8-bit PCM.  The CDXL file itself does not seem to contain any sampling rate information, although related documents suggest 11025 Hz is standard.&lt;br /&gt;
&lt;br /&gt;
== Video ==&lt;br /&gt;
&lt;br /&gt;
Video is encoded differently based on the info byte in the header.&lt;br /&gt;
&lt;br /&gt;
RGB is encoded as an index into the palette for the current frame.&lt;br /&gt;
&lt;br /&gt;
HAM (Hold-And-Modify), an Amiga-specific video mode, is encoded as described on [http://en.wikipedia.org/wiki/Hold-And-Modify Wikipedia].&lt;br /&gt;
The sample at http://samples.mplayerhq.hu/game-formats/cdxl/amigaball.cdxl at least seems to use the HAM6 method, since it has 32 bytes&lt;br /&gt;
of palette (i.e. 16 palette entries).&lt;br /&gt;
&lt;br /&gt;
[[Category:Container Formats]]&lt;br /&gt;
[[Category:Video Codecs]]&lt;br /&gt;
[[Category:Incomplete Video Codecs]]&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=FFmpeg_Summer_Of_Code_2009&amp;diff=11370</id>
		<title>FFmpeg Summer Of Code 2009</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=FFmpeg_Summer_Of_Code_2009&amp;diff=11370"/>
		<updated>2009-03-20T09:51:44Z</updated>

		<summary type="html">&lt;p&gt;Reimar: /* AACS implementation */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Current Status ==&lt;br /&gt;
&lt;br /&gt;
This list is still a work-in-progress, please see also the [[Talk:FFmpeg Summer Of Code 2009|Talk Page]].&lt;br /&gt;
&lt;br /&gt;
== Qualification tasks ==&lt;br /&gt;
&lt;br /&gt;
For us to consider your application for SoC, we require a completed qualification task. Choose a task from the [[Small FFmpeg Tasks|Small Tasks list]], send an email to the FFmpeg-devel mailing list to announce that you are working on it (to avoid duplicated work), and when it is ready, submit it for review on FFmpeg-devel. The task is considered completed when your patch is accepted into our main SVN tree.&lt;br /&gt;
&lt;br /&gt;
== 1st Tier Project Proposals ==&lt;br /&gt;
1st tier project proposals are project ideas that are reasonably well defined '''AND''' have a volunteer mentor.&lt;br /&gt;
&lt;br /&gt;
=== S/PDIF muxer ===&lt;br /&gt;
* Implement a muxer capable of muxing:&lt;br /&gt;
** DTS, all 3 packing modes and the usable HD extensions&lt;br /&gt;
** AC3, including eAC3&lt;br /&gt;
** MLP&lt;br /&gt;
** PCM&lt;br /&gt;
** WMApro&lt;br /&gt;
** AAC&lt;br /&gt;
** MPEG audio, layers 2 and 3&lt;br /&gt;
&lt;br /&gt;
Implement support in ffplay so that it is possible to output the audio stream over S/PDIF when playing a media file.&lt;br /&gt;
&lt;br /&gt;
''Mentor: Benjamin Larsson''&lt;br /&gt;
&lt;br /&gt;
=== Flash Screen video 2 codec ===&lt;br /&gt;
* Implement a flashsv2 decoder and encoder, and extend the current flashsv encoder to support optimal 2-pass encoding.&lt;br /&gt;
''Mentor: Benjamin Larsson''&lt;br /&gt;
&lt;br /&gt;
=== MPEG-4 ALS decoder ===&lt;br /&gt;
*primary goal: stream copy of ALS frames in MP4 files from reference encoder&lt;br /&gt;
** detect codec_id&lt;br /&gt;
** preserve extradata&lt;br /&gt;
*primary goal: write the decoder based on the ISO specification&lt;br /&gt;
** ISO/IEC 14496-3:2005/Amd.2:2006 and related corrigenda&lt;br /&gt;
*primary goal: decode files with basic ALS features&lt;br /&gt;
** integer samples&lt;br /&gt;
** LPC&lt;br /&gt;
** rice coding&lt;br /&gt;
** joint-stereo&lt;br /&gt;
*secondary goal: decode files with more advanced ALS features&lt;br /&gt;
** floating-point samples&lt;br /&gt;
** block switching&lt;br /&gt;
** LTP (long term prediction)&lt;br /&gt;
** BGMC (arithmetic coding)&lt;br /&gt;
** MCC (advanced multi-channel)&lt;br /&gt;
** RLSLMS (backward-adaptive prediction)&lt;br /&gt;
*secondary goal: pass the ISO conformance tests&lt;br /&gt;
*secondary goal: handle anything the reference encoder can come up with&lt;br /&gt;
''Mentor: Justin Ruggles''&lt;br /&gt;
&lt;br /&gt;
=== Playlist/Concatenation Support ===&lt;br /&gt;
*primary goal: implement a playlist/concatenation interface&lt;br /&gt;
to transcode (FFmpeg) and play (FFplay) media&lt;br /&gt;
** interface will use command-line switches&lt;br /&gt;
** interface must support every input format FFmpeg supports&lt;br /&gt;
** interface must work with different input stream parameters (different formats, codecs, video resolution, audio sample rate, audio channels, etc.)&lt;br /&gt;
** interface must support track selection&lt;br /&gt;
** interface must support existing playlist file formats: .m3u, .pls, .xspf&lt;br /&gt;
&lt;br /&gt;
''Mentor: Baptiste Coudurier''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== NEW Seeking API ===&lt;br /&gt;
*primary goal: implement a new seeking API in libavformat&lt;br /&gt;
** implement av_seek_file in libavformat&lt;br /&gt;
** implement a compatible new seek_file for every AVInputFormat, porting the existing seek function where possible&lt;br /&gt;
** implement an av_build_index function that builds an AVIndex for the file&lt;br /&gt;
** implement an av_export_index function that saves the AVIndex to a file so it can be loaded later&lt;br /&gt;
&lt;br /&gt;
''Mentor: Baptiste Coudurier''&lt;br /&gt;
&lt;br /&gt;
=== Improve RTSP/RTP layer ===&lt;br /&gt;
*primary goal: improve the receiver compatibility&lt;br /&gt;
** Add support for more widespread formats (list will follow; check gst, live555 and feng)&lt;br /&gt;
*** X-Qt/quicktime depayloader (see [http://www.gnome.org/~rbultje/ffmpeg-patchset/ X-QT patch])&lt;br /&gt;
*** vorbis and theora depayloader (see [[Small_FFmpeg_Tasks#Implement_the_RTP.2FVorbis_payload]])&lt;br /&gt;
*** h263 and h263+ (see [http://roundup.ffmpeg.org/roundup/ffmpeg/issue678 Issue 678])&lt;br /&gt;
*** ...more...&lt;br /&gt;
** support the QuickTime HTTP tunnel mode&lt;br /&gt;
*secondary goal: provide an API to expose the RTCP layer (and the equivalent in the RDT dialect)&lt;br /&gt;
*secondary goal: try to support subtitle streams (either as RTCP-XR or an application/text stream)&lt;br /&gt;
*secondary goal: make VideoLAN Client, MPlayer and Xine use FFmpeg's RTSP&lt;br /&gt;
&lt;br /&gt;
''Mentor: Luca Barbato, Ronald S. Bultje''&lt;br /&gt;
&lt;br /&gt;
=== AACS implementation ===&lt;br /&gt;
* Add the ability to encode and decode using the Advanced Access Content System (AACS) to FFmpeg.&lt;br /&gt;
* Specifications: http://www.aacsla.com/specifications/&lt;br /&gt;
* Existing implementations, e.g. DumpHD: http://forum.doom9.org/showthread.php?t=123111&lt;br /&gt;
* Most parts (BD-J, MKB, title key generation) probably do not belong in FFmpeg; this should be discussed with us before submitting an application&lt;br /&gt;
** possible solution: only implement the &amp;quot;lowest&amp;quot; level (decoding given the correct title key), but implement CSS en- and decryption as a secondary goal&lt;br /&gt;
&lt;br /&gt;
''Mentor: Reimar Döffinger''&lt;br /&gt;
&lt;br /&gt;
== 2nd Tier Project Proposals ==&lt;br /&gt;
All that separates these proposals from their 1st tier brethren is the lack of a mentor.&lt;br /&gt;
&lt;br /&gt;
=== Finish SoC projects from previous years ===&lt;br /&gt;
Some projects are lingering in the dark unfinished. They should be picked up and made ready for inclusion. These projects are potentially less involved than starting from scratch, but also more useful for FFmpeg since the probability that the projects get finished should be higher. If some of them are deemed too easy, they could be combined.&lt;br /&gt;
&lt;br /&gt;
Unfinished projects from previous years are:&lt;br /&gt;
&lt;br /&gt;
2006:&lt;br /&gt;
* AMR-NB decoder&lt;br /&gt;
&lt;br /&gt;
2007:&lt;br /&gt;
* QCELP decoder (missing features)&lt;br /&gt;
* JPEG 2000 decoder&lt;br /&gt;
* JPEG 2000 encoder&lt;br /&gt;
* Dirac decoder&lt;br /&gt;
* Dirac encoder&lt;br /&gt;
* TS muxer&lt;br /&gt;
&lt;br /&gt;
2008:&lt;br /&gt;
* Generic frame-level multithreading support&lt;br /&gt;
* AAC-LC encoder&lt;br /&gt;
* MLP/TrueHD encoder&lt;br /&gt;
* WMA Pro decoder&lt;br /&gt;
&lt;br /&gt;
=== [[Libavfilter]] video work ===&lt;br /&gt;
Libavfilter is the FFmpeg filtering library that started as a 2007 SoC [[FFmpeg Summer Of Code#Video Filter API (AKA libavfilter)|project]]. It should replace the now-removed vhook subsystem. Most of it is already part of the FFmpeg main source tree, but there are a few bits remaining. This project would consist of the following tasks:&lt;br /&gt;
&lt;br /&gt;
* Get the remaining bits of the SoC tree committed, including the ffmpeg.c and ffplay.c patch&lt;br /&gt;
* Get libavfilter enabled in the main SVN tree&lt;br /&gt;
* Write a watermark filter (this is one of the most commonly requested FFmpeg features)&lt;br /&gt;
* Write an expand/pad filter (see [http://thread.gmane.org/gmane.comp.video.ffmpeg.devel/85015] and [http://thread.gmane.org/gmane.comp.video.ffmpeg.soc/2779/])&lt;br /&gt;
* Port all MPlayer filters from libmpcodecs/vf_* (do not forget to ask the authors whether it is OK to release them under the LGPL)&lt;br /&gt;
&lt;br /&gt;
see also this ffmpeg-devel message: [http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2009-March/064817.html]&lt;br /&gt;
&lt;br /&gt;
=== [[Libavfilter]] audio work ===&lt;br /&gt;
At the moment, the FFmpeg filtering library has no support at all for handling audio. This task would consist of:&lt;br /&gt;
&lt;br /&gt;
* Expanding the libavfilter framework to work with audio&lt;br /&gt;
* Writing a resampling filter (starting with just wrapper code around libavcodec/audioconvert.c)&lt;br /&gt;
* Implement negotiation of sample format and number of channels analogously to the libavfilter colorspace negotiation&lt;br /&gt;
* Make the resampling filter work for several combinations of sample format and channels&lt;br /&gt;
* Write a visualization filter as proof-of-concept of a filter that works with both video and audio&lt;br /&gt;
&lt;br /&gt;
=== Implement a better regression test system ===&lt;br /&gt;
* Split up the current regtests&lt;br /&gt;
* Add tests for all the missing formats and codecs to FATE&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== libvo ===&lt;br /&gt;
* Port MPlayer's libvo to ffplay&lt;br /&gt;
* Note that this does not just mean producing a working hack so that ffplay can use xv, but creating a clean and acceptable wrapper for (most of) libvo.&lt;br /&gt;
&lt;br /&gt;
=== GStreamer input ===&lt;br /&gt;
* Just as we have VfW input, we could also have a GStreamer input format. This would enable support for wmapro and wmalossless until those formats are reverse-engineered.&lt;br /&gt;
''Mentor: Christian Schaller''&lt;br /&gt;
&lt;br /&gt;
=== AMR-WB Decoder ===&lt;br /&gt;
* Specification: http://www.3gpp.org/ftp/Specs/html-info/26-series.htm&lt;br /&gt;
* Samples: http://samples.mplayerhq.hu/A-codecs/amr/&lt;br /&gt;
Also see [[AMR]].&lt;br /&gt;
&lt;br /&gt;
=== GSM Decoder ===&lt;br /&gt;
* Specification + sample implementation: http://kbs.cs.tu-berlin.de/~jutta/toast.html&lt;br /&gt;
* Samples: http://samples.mplayerhq.hu/A-codecs/GSM/&lt;br /&gt;
Also see [[GSM]].&lt;br /&gt;
&lt;br /&gt;
=== Sipr Decoder ===&lt;br /&gt;
* Specification: will be provided&lt;br /&gt;
* Samples: http://samples.mplayerhq.hu/real/AC-sipr/&lt;br /&gt;
Also see [[RealAudio sipr]] and [[Interesting_Patches#RealAudio_SIPR_.4016k_decoder_and_demuxer_by_Vladimir_Voroshilov|this patch]].&lt;br /&gt;
&lt;br /&gt;
=== Speex Decoder ===&lt;br /&gt;
* Specification:  http://speex.org/docs/&lt;br /&gt;
* Samples: http://samples.mplayerhq.hu/A-codecs/speex/&lt;br /&gt;
Also see [[Speex]].&lt;br /&gt;
&lt;br /&gt;
=== AMR-NB Encoder ===&lt;br /&gt;
* Specification: http://www.3gpp.org/ftp/Specs/html-info/26-series.htm&lt;br /&gt;
* Samples: http://samples.mplayerhq.hu/A-codecs/amr/&lt;br /&gt;
Also see [[AMR]].&lt;br /&gt;
&lt;br /&gt;
=== VP6 Encoder ===&lt;br /&gt;
* Specification: [[On2 VP6]]&lt;br /&gt;
* Samples: http://samples.mplayerhq.hu/V-codecs/VP6/&lt;br /&gt;
&lt;br /&gt;
=== WMV3 Encoder ===&lt;br /&gt;
* Clearly defined task&lt;br /&gt;
* Primary goal: Encode video sequences such that they can be decoded by a Windows Media player.&lt;br /&gt;
&lt;br /&gt;
This could either be done by improving [[Interesting Patches#WMV3 encoder by Denis Fortin|this patch]] or by writing the encoder from scratch.&lt;br /&gt;
&lt;br /&gt;
=== Improve subtitle support ===&lt;br /&gt;
&lt;br /&gt;
* Add text-to-bitmap conversion functions&lt;br /&gt;
** One with hard-coded bitmaps for characters&lt;br /&gt;
** One that utilizes FreeType&lt;br /&gt;
* The function used will be chosen at compile time&lt;br /&gt;
&lt;br /&gt;
Adjust existing subtitle support to the new ABI&lt;br /&gt;
* ABI change: http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2009-January/058521.html&lt;br /&gt;
&lt;br /&gt;
=== VC-1 Interlaced Support ===&lt;br /&gt;
* Add support for interlaced streams, as used in Blu-ray recordings, to the VC-1 decoder.&lt;br /&gt;
* This includes fixing some reference streams&lt;br /&gt;
&lt;br /&gt;
=== Improve Ratecontrol ===&lt;br /&gt;
*Primary goal 1: Fast heuristic VBV-compliant per-macroblock ratecontrol with better PSNR/bitrate and better subjective quality/bitrate than the current code.&lt;br /&gt;
*Primary goal 2: VBV-compliant, rate-distortion-optimal per-macroblock ratecontrol using the Viterbi algorithm.&lt;br /&gt;
*Secondary goal 1: Fast heuristic scene change detection that detects scene changes more accurately and has better PSNR/bitrate and subjective quality/bitrate than the current heuristic.&lt;br /&gt;
*Secondary goal 2: Rate-distortion-optimal (for the current picture) scene change detection.&lt;br /&gt;
*Secondary goal 3: B-frame decision that is faster and/or has a higher PSNR/bitrate and subjective quality/bitrate than the current code.&lt;br /&gt;
&lt;br /&gt;
=== WMA lossless ===&lt;br /&gt;
* Implement a decoder for WMA lossless (0x0163)&lt;br /&gt;
* Reuse as much libavcodec code as possible&lt;br /&gt;
* Samples: http://samples.mplayerhq.hu/A-codecs/WMA9/wma_0x163.wma http://samples.mplayerhq.hu/A-codecs/lossless/luckynight.wma&lt;br /&gt;
&lt;br /&gt;
=== WTV (de)muxer ===&lt;br /&gt;
* Implement a demuxer (and possibly a muxer) for the [[WTV]] file format.&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=FFmpeg_Summer_Of_Code_2009&amp;diff=11359</id>
		<title>FFmpeg Summer Of Code 2009</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=FFmpeg_Summer_Of_Code_2009&amp;diff=11359"/>
		<updated>2009-03-19T18:52:37Z</updated>

		<summary type="html">&lt;p&gt;Reimar: DumpHD AACS implementation&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Current Status ==&lt;br /&gt;
&lt;br /&gt;
This list is still a work-in-progress, please see also the [[Talk:FFmpeg Summer Of Code 2009|Talk Page]].&lt;br /&gt;
&lt;br /&gt;
== Qualification tasks ==&lt;br /&gt;
&lt;br /&gt;
For us to consider your application for SoC, we require a completed qualification task. Choose a task from the [[Small FFmpeg Tasks|Small Tasks list]], send an email to the FFmpeg-devel mailing list to announce that you are working on it (to avoid duplicated work), and when it is ready, submit it for review on FFmpeg-devel. The task is considered completed when your patch is accepted into our main SVN tree.&lt;br /&gt;
&lt;br /&gt;
== 1st Tier Project Proposals ==&lt;br /&gt;
1st tier project proposals are project ideas that are reasonably well defined '''AND''' have a volunteer mentor.&lt;br /&gt;
&lt;br /&gt;
=== S/PDIF muxer ===&lt;br /&gt;
* Implement a muxer capable of muxing:&lt;br /&gt;
** DTS, all 3 packing modes and the usable HD extensions&lt;br /&gt;
** AC3, including eAC3&lt;br /&gt;
** MLP&lt;br /&gt;
** PCM&lt;br /&gt;
** WMApro&lt;br /&gt;
** AAC&lt;br /&gt;
** MPEG audio, layers 2 and 3&lt;br /&gt;
&lt;br /&gt;
Implement support in ffplay so that it is possible to output the audio stream over S/PDIF when playing a media file.&lt;br /&gt;
&lt;br /&gt;
''Mentor: Benjamin Larsson''&lt;br /&gt;
&lt;br /&gt;
=== Flash Screen video 2 codec ===&lt;br /&gt;
* Implement a flashsv2 decoder and encoder, and extend the current flashsv encoder to support optimal 2-pass encoding.&lt;br /&gt;
''Mentor: Benjamin Larsson''&lt;br /&gt;
&lt;br /&gt;
=== MPEG-4 ALS decoder ===&lt;br /&gt;
*primary goal: stream copy of ALS frames in MP4 files from reference encoder&lt;br /&gt;
** detect codec_id&lt;br /&gt;
** preserve extradata&lt;br /&gt;
*primary goal: write the decoder based on the ISO specification&lt;br /&gt;
** ISO/IEC 14496-3:2005/Amd.2:2006 and related corrigenda&lt;br /&gt;
*primary goal: decode files with basic ALS features&lt;br /&gt;
** integer samples&lt;br /&gt;
** LPC&lt;br /&gt;
** rice coding&lt;br /&gt;
** joint-stereo&lt;br /&gt;
*secondary goal: decode files with more advanced ALS features&lt;br /&gt;
** floating-point samples&lt;br /&gt;
** block switching&lt;br /&gt;
** LTP (long term prediction)&lt;br /&gt;
** BGMC (arithmetic coding)&lt;br /&gt;
** MCC (advanced multi-channel)&lt;br /&gt;
** RLSLMS (backward-adaptive prediction)&lt;br /&gt;
*secondary goal: pass the ISO conformance tests&lt;br /&gt;
*secondary goal: handle anything the reference encoder can come up with&lt;br /&gt;
''Mentor: Justin Ruggles''&lt;br /&gt;
&lt;br /&gt;
=== Playlist/Concatenation Support ===&lt;br /&gt;
*primary goal: implement a playlist/concatenation interface&lt;br /&gt;
to transcode (FFmpeg) and play (FFplay) media&lt;br /&gt;
** interface will use command-line switches&lt;br /&gt;
** interface must support every input format FFmpeg supports&lt;br /&gt;
** interface must work with different input stream parameters (different formats, codecs, video resolution, audio sample rate, audio channels, etc.)&lt;br /&gt;
** interface must support track selection&lt;br /&gt;
** interface must support existing playlist file formats: .m3u, .pls, .xspf&lt;br /&gt;
&lt;br /&gt;
''Mentor: Baptiste Coudurier''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== NEW Seeking API ===&lt;br /&gt;
*primary goal: implement a new seeking API in libavformat&lt;br /&gt;
** implement av_seek_file in libavformat&lt;br /&gt;
** implement a compatible new seek_file for every AVInputFormat, porting the existing seek function where possible&lt;br /&gt;
** implement an av_build_index function that builds an AVIndex for the file&lt;br /&gt;
** implement an av_export_index function that saves the AVIndex to a file so it can be loaded later&lt;br /&gt;
&lt;br /&gt;
''Mentor: Baptiste Coudurier''&lt;br /&gt;
&lt;br /&gt;
=== Improve RTSP/RTP layer ===&lt;br /&gt;
*primary goal: improve the receiver compatibility&lt;br /&gt;
** Add support for more widespread formats (list will follow; check gst, live555 and feng)&lt;br /&gt;
*** X-Qt/quicktime depayloader (see [http://www.gnome.org/~rbultje/ffmpeg-patchset/ X-QT patch])&lt;br /&gt;
*** vorbis and theora depayloader (see [[Small_FFmpeg_Tasks#Implement_the_RTP.2FVorbis_payload]])&lt;br /&gt;
*** h263 and h263+ (see [http://roundup.ffmpeg.org/roundup/ffmpeg/issue678 Issue 678])&lt;br /&gt;
*** ...more...&lt;br /&gt;
** support the QuickTime HTTP tunnel mode&lt;br /&gt;
*secondary goal: provide an API to expose the RTCP layer (and the equivalent in the RDT dialect)&lt;br /&gt;
*secondary goal: try to support subtitle streams (either as RTCP-XR or an application/text stream)&lt;br /&gt;
*secondary goal: make VideoLAN Client, MPlayer and Xine use FFmpeg's RTSP&lt;br /&gt;
&lt;br /&gt;
''Mentor: Luca Barbato, Ronald S. Bultje''&lt;br /&gt;
&lt;br /&gt;
== 2nd Tier Project Proposals ==&lt;br /&gt;
All that separates these proposals from their 1st tier brethren is the lack of a mentor.&lt;br /&gt;
&lt;br /&gt;
=== [[Libavfilter]] video work ===&lt;br /&gt;
Libavfilter is the FFmpeg filtering library that started as a 2007 SoC [[FFmpeg Summer Of Code#Video Filter API (AKA libavfilter)|project]]. It should replace the now-removed vhook subsystem. Most of it is already part of the FFmpeg main source tree, but there are a few bits remaining. This project would consist of the following tasks:&lt;br /&gt;
&lt;br /&gt;
* Get the remaining bits of the SoC tree committed, including the ffmpeg.c and ffplay.c patch&lt;br /&gt;
* Get libavfilter enabled in the main SVN tree&lt;br /&gt;
* Write a watermark filter (this is one of the most commonly requested FFmpeg features)&lt;br /&gt;
* Write an expand/pad filter (see [http://thread.gmane.org/gmane.comp.video.ffmpeg.devel/85015] and [http://thread.gmane.org/gmane.comp.video.ffmpeg.soc/2779/])&lt;br /&gt;
* Port all MPlayer filters from libmpcodecs/vf_* (do not forget to ask the authors whether it is OK to release them under the LGPL)&lt;br /&gt;
&lt;br /&gt;
see also this ffmpeg-devel message: [http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2009-March/064817.html]&lt;br /&gt;
&lt;br /&gt;
=== [[Libavfilter]] audio work ===&lt;br /&gt;
At the moment, the FFmpeg filtering library has no support at all for handling audio. This task would consist of:&lt;br /&gt;
&lt;br /&gt;
* Expanding the libavfilter framework to work with audio&lt;br /&gt;
* Writing a resampling filter (starting with just wrapper code around libavcodec/audioconvert.c)&lt;br /&gt;
* Implement negotiation of sample format and number of channels analogously to the libavfilter colorspace negotiation&lt;br /&gt;
* Make the resampling filter work for several combinations of sample format and channels&lt;br /&gt;
* Write a visualization filter as proof-of-concept of a filter that works with both video and audio&lt;br /&gt;
&lt;br /&gt;
=== Implement a better regression test system ===&lt;br /&gt;
* Split up the current regtests&lt;br /&gt;
* Add tests for all the missing formats and codecs to FATE&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== libvo ===&lt;br /&gt;
* Port MPlayer's libvo to ffplay&lt;br /&gt;
* Note that this does not just mean producing a working hack so that ffplay can use xv, but creating a clean and acceptable wrapper for (most of) libvo.&lt;br /&gt;
&lt;br /&gt;
=== GStreamer input ===&lt;br /&gt;
* Just as we have VfW input, we could also have a GStreamer input format. This would enable support for wmapro and wmalossless until those formats are reverse-engineered.&lt;br /&gt;
''Mentor: Christian Schaller''&lt;br /&gt;
&lt;br /&gt;
=== AMR-WB Decoder ===&lt;br /&gt;
* Specification: http://www.3gpp.org/ftp/Specs/html-info/26-series.htm&lt;br /&gt;
* Samples: http://samples.mplayerhq.hu/A-codecs/amr/&lt;br /&gt;
Also see [[AMR]].&lt;br /&gt;
&lt;br /&gt;
=== GSM Decoder ===&lt;br /&gt;
* Specification + sample implementation: http://kbs.cs.tu-berlin.de/~jutta/toast.html&lt;br /&gt;
* Samples: http://samples.mplayerhq.hu/A-codecs/GSM/&lt;br /&gt;
Also see [[GSM]].&lt;br /&gt;
&lt;br /&gt;
=== Sipr Decoder ===&lt;br /&gt;
* Specification: will be provided&lt;br /&gt;
* Samples: http://samples.mplayerhq.hu/real/AC-sipr/&lt;br /&gt;
Also see [[RealAudio sipr]] and [[Interesting_Patches#RealAudio_SIPR_.4016k_decoder_and_demuxer_by_Vladimir_Voroshilov|this patch]].&lt;br /&gt;
&lt;br /&gt;
=== Speex Decoder ===&lt;br /&gt;
* Specification:  http://speex.org/docs/&lt;br /&gt;
* Samples: http://samples.mplayerhq.hu/A-codecs/speex/&lt;br /&gt;
Also see [[Speex]].&lt;br /&gt;
&lt;br /&gt;
=== AMR-NB Encoder ===&lt;br /&gt;
* Specification: http://www.3gpp.org/ftp/Specs/html-info/26-series.htm&lt;br /&gt;
* Samples: http://samples.mplayerhq.hu/A-codecs/amr/&lt;br /&gt;
Also see [[AMR]].&lt;br /&gt;
&lt;br /&gt;
=== VP6 Encoder ===&lt;br /&gt;
* Specification: [[On2 VP6]]&lt;br /&gt;
* Samples: http://samples.mplayerhq.hu/V-codecs/VP6/&lt;br /&gt;
&lt;br /&gt;
=== WMV3 Encoder ===&lt;br /&gt;
* Clearly defined task&lt;br /&gt;
* Primary goal: Encode video sequences such that they can be decoded by a Windows Media player.&lt;br /&gt;
&lt;br /&gt;
This could either be done by improving [[Interesting Patches#WMV3 encoder by Denis Fortin|this patch]] or by writing the encoder from scratch.&lt;br /&gt;
&lt;br /&gt;
=== Improve subtitle support ===&lt;br /&gt;
&lt;br /&gt;
* Add text-to-bitmap conversion functions&lt;br /&gt;
** One with hard-coded bitmaps for characters&lt;br /&gt;
** One that utilizes FreeType&lt;br /&gt;
* The function used will be chosen at compile time&lt;br /&gt;
&lt;br /&gt;
Adjust existing subtitle support to the new ABI&lt;br /&gt;
* ABI change: http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2009-January/058521.html&lt;br /&gt;
&lt;br /&gt;
=== AACS implementation ===&lt;br /&gt;
* Add the ability to encode and decode using the Advanced Access Content System (AACS) to FFmpeg.&lt;br /&gt;
* Specifications: http://www.aacsla.com/specifications/&lt;br /&gt;
* Existing implementations, e.g. DumpHD: http://forum.doom9.org/showthread.php?t=123111&lt;br /&gt;
* Most parts (BD-J, MKB, title key generation) probably do not belong in FFmpeg; this should be discussed with us before submitting an application&lt;br /&gt;
&lt;br /&gt;
''Mentor: Reimar Döffinger''&lt;br /&gt;
&lt;br /&gt;
=== VC-1 Interlaced Support ===&lt;br /&gt;
* Add support for interlaced streams, as used in Blu-ray recordings, to the VC-1 decoder.&lt;br /&gt;
* This includes fixing some reference streams&lt;br /&gt;
&lt;br /&gt;
=== Improve Ratecontrol ===&lt;br /&gt;
*Primary goal 1: Fast heuristic VBV-compliant per-macroblock ratecontrol with better PSNR/bitrate and better subjective quality/bitrate than the current code.&lt;br /&gt;
*Primary goal 2: VBV-compliant, rate-distortion-optimal per-macroblock ratecontrol using the Viterbi algorithm.&lt;br /&gt;
*Secondary goal 1: Fast heuristic scene change detection that detects scene changes more accurately and has better PSNR/bitrate and subjective quality/bitrate than the current heuristic.&lt;br /&gt;
*Secondary goal 2: Rate-distortion-optimal (for the current picture) scene change detection.&lt;br /&gt;
*Secondary goal 3: B-frame decision that is faster and/or has a higher PSNR/bitrate and subjective quality/bitrate than the current code.&lt;br /&gt;
&lt;br /&gt;
=== WMA lossless ===&lt;br /&gt;
* Implement a decoder for WMA lossless (0x0163)&lt;br /&gt;
* Reuse as much libavcodec code as possible&lt;br /&gt;
* Samples: http://samples.mplayerhq.hu/A-codecs/WMA9/wma_0x163.wma http://samples.mplayerhq.hu/A-codecs/lossless/luckynight.wma&lt;br /&gt;
&lt;br /&gt;
=== WTV (de)muxer ===&lt;br /&gt;
* Implement a demuxer (and possibly a muxer) for the [[WTV]] file format.&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=FFmpeg_Summer_Of_Code_2009&amp;diff=11358</id>
		<title>FFmpeg Summer Of Code 2009</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=FFmpeg_Summer_Of_Code_2009&amp;diff=11358"/>
		<updated>2009-03-19T18:50:41Z</updated>

		<summary type="html">&lt;p&gt;Reimar: /* AACS implementation */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Current Status ==&lt;br /&gt;
&lt;br /&gt;
This list is still a work-in-progress, please see also the [[Talk:FFmpeg Summer Of Code 2009|Talk Page]].&lt;br /&gt;
&lt;br /&gt;
== Qualification tasks ==&lt;br /&gt;
&lt;br /&gt;
For us to consider your application for SoC, we require a completed qualification task. Choose a task from the [[Small FFmpeg Tasks|Small Tasks list]], send an email to the FFmpeg-devel mailing list to announce that you are working on it (to avoid duplicated work), and when it is ready, submit it for review on FFmpeg-devel. The task is considered completed when your patch is accepted into our main SVN tree.&lt;br /&gt;
&lt;br /&gt;
== 1st Tier Project Proposals ==&lt;br /&gt;
1st tier project proposals are project ideas that are reasonably well defined '''AND''' have a volunteer mentor.&lt;br /&gt;
&lt;br /&gt;
=== S/PDIF muxer ===&lt;br /&gt;
* Implement a muxer capable of muxing:&lt;br /&gt;
** DTS, all 3 packing modes and the usable HD extensions&lt;br /&gt;
** AC3, including eAC3&lt;br /&gt;
** MLP&lt;br /&gt;
** PCM&lt;br /&gt;
** WMApro&lt;br /&gt;
** AAC&lt;br /&gt;
** MPEG audio, layers 2 and 3&lt;br /&gt;
&lt;br /&gt;
Implement support in ffplay so that it is possible to output the audio stream over S/PDIF when playing a media file.&lt;br /&gt;
&lt;br /&gt;
''Mentor: Benjamin Larsson''&lt;br /&gt;
&lt;br /&gt;
=== Flash Screen video 2 codec ===&lt;br /&gt;
* Implement a flashsv2 decoder and encoder, and extend the current flashsv encoder to support optimal 2-pass encoding.&lt;br /&gt;
''Mentor: Benjamin Larsson''&lt;br /&gt;
&lt;br /&gt;
=== MPEG-4 ALS decoder ===&lt;br /&gt;
*primary goal: stream copy of ALS frames in MP4 files from reference encoder&lt;br /&gt;
** detect codec_id&lt;br /&gt;
** preserve extradata&lt;br /&gt;
*primary goal: write the decoder based on the ISO specification&lt;br /&gt;
** ISO/IEC 14496-3:2005/Amd.2:2006 and related corrigenda&lt;br /&gt;
*primary goal: decode files with basic ALS features&lt;br /&gt;
** integer samples&lt;br /&gt;
** LPC&lt;br /&gt;
** rice coding&lt;br /&gt;
** joint-stereo&lt;br /&gt;
*secondary goal: decode files with more advanced ALS features&lt;br /&gt;
** floating-point samples&lt;br /&gt;
** block switching&lt;br /&gt;
** LTP (long term prediction)&lt;br /&gt;
** BGMC (arithmetic coding)&lt;br /&gt;
** MCC (advanced multi-channel)&lt;br /&gt;
** RLSLMS (backward-adaptive prediction)&lt;br /&gt;
*secondary goal: pass the ISO conformance tests&lt;br /&gt;
*secondary goal: handle anything the reference encoder can come up with&lt;br /&gt;
''Mentor: Justin Ruggles''&lt;br /&gt;
&lt;br /&gt;
=== Playlist/Concatenation Support ===&lt;br /&gt;
*primary goal: implement a playlist/concatenation interface&lt;br /&gt;
to transcode (FFmpeg) and play (FFplay) media&lt;br /&gt;
** interface will use command-line switches&lt;br /&gt;
** interface must support every input format FFmpeg supports&lt;br /&gt;
** interface must work with different input stream parameters (different formats, codecs, video resolution, audio sample rate, audio channels, etc.)&lt;br /&gt;
** interface must support track selection&lt;br /&gt;
** interface must support existing playlist file formats: .m3u, .pls, .xspf&lt;br /&gt;
&lt;br /&gt;
''Mentor: Baptiste Coudurier''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== NEW Seeking API ===&lt;br /&gt;
*primary goal: implement a new seeking API in libavformat&lt;br /&gt;
** implement av_seek_file in libavformat&lt;br /&gt;
** implement a compatible new seek_file for every AVInputFormat, porting the existing seek function where possible&lt;br /&gt;
** implement an av_build_index function that builds an AVIndex for the file&lt;br /&gt;
** implement an av_export_index function that saves the AVIndex to a file so it can be loaded later&lt;br /&gt;
&lt;br /&gt;
''Mentor: Baptiste Coudurier''&lt;br /&gt;
&lt;br /&gt;
=== Improve RTSP/RTP layer ===&lt;br /&gt;
*primary goal: improve the receiver compatibility&lt;br /&gt;
** Add support for more widespread formats (list will follow; check gst, live555 and feng)&lt;br /&gt;
*** X-Qt/quicktime depayloader (see [http://www.gnome.org/~rbultje/ffmpeg-patchset/ X-QT patch])&lt;br /&gt;
*** vorbis and theora depayloader (see [[Small_FFmpeg_Tasks#Implement_the_RTP.2FVorbis_payload]])&lt;br /&gt;
*** h263 and h263+ (see [http://roundup.ffmpeg.org/roundup/ffmpeg/issue678 Issue 678])&lt;br /&gt;
*** ...more...&lt;br /&gt;
** support the QuickTime HTTP tunnel mode&lt;br /&gt;
*secondary goal: provide an API to expose the RTCP layer (and the equivalent in the RDT dialect)&lt;br /&gt;
*secondary goal: try to support subtitle streams (either as RTCP-XR or an application/text stream)&lt;br /&gt;
*secondary goal: make VideoLAN Client, MPlayer and Xine use FFmpeg's RTSP&lt;br /&gt;
&lt;br /&gt;
''Mentor: Luca Barbato, Ronald S. Bultje''&lt;br /&gt;
&lt;br /&gt;
== 2nd Tier Project Proposals ==&lt;br /&gt;
All that separates these proposals from their 1st tier brethren is the lack of a mentor.&lt;br /&gt;
&lt;br /&gt;
=== [[Libavfilter]] video work ===&lt;br /&gt;
Libavfilter is the FFmpeg filtering library that started as a 2007 SoC [[FFmpeg Summer Of Code#Video Filter API (AKA libavfilter)|project]]. It should replace the now-removed vhook subsystem. Most of it is already part of the FFmpeg main source tree, but there are a few bits remaining. This project would consist of the following tasks:&lt;br /&gt;
&lt;br /&gt;
* Get the remaining bits of the SoC tree committed, including the ffmpeg.c and ffplay.c patch&lt;br /&gt;
* Get libavfilter enabled in the main SVN tree&lt;br /&gt;
* Write a watermark filter (this is one of the most commonly requested FFmpeg features)&lt;br /&gt;
* Write an expand/pad filter (see [http://thread.gmane.org/gmane.comp.video.ffmpeg.devel/85015] and [http://thread.gmane.org/gmane.comp.video.ffmpeg.soc/2779/])&lt;br /&gt;
* Port all MPlayer filters from libmpcodecs/vf_* (do not forget to ask the authors whether it is OK to release them under the LGPL)&lt;br /&gt;
&lt;br /&gt;
see also this ffmpeg-devel message: [http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2009-March/064817.html]&lt;br /&gt;
&lt;br /&gt;
=== [[Libavfilter]] audio work ===&lt;br /&gt;
At the moment, the FFmpeg filtering library has no support at all for handling audio. This task would consist of:&lt;br /&gt;
&lt;br /&gt;
* Expanding the libavfilter framework to work with audio&lt;br /&gt;
* Writing a resampling filter (starting with just wrapper code around libavcodec/audioconvert.c)&lt;br /&gt;
* Implementing negotiation of sample format and number of channels, analogously to the libavfilter colorspace negotiation&lt;br /&gt;
* Making the resampling filter work for several combinations of sample formats and channel counts&lt;br /&gt;
* Write a visualization filter as proof-of-concept of a filter that works with both video and audio&lt;br /&gt;
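As a minimal sketch of the negotiation step listed above (the names and types are illustrative, not the real libavfilter API): two connected filter pads can agree on a sample format by intersecting their supported-format lists, just as libavfilter negotiates colorspaces for video links.

```c
/* Hypothetical sketch; names are illustrative, not real libavfilter API.
 * Negotiate a common sample format between two filter pads by
 * intersecting their supported-format lists. */

enum SampleFormat { FMT_NONE = -1, FMT_U8, FMT_S16, FMT_S32, FMT_FLT };

/* Both lists are terminated by FMT_NONE; returns the first format
 * supported by both sides, or FMT_NONE if the intersection is empty. */
static enum SampleFormat negotiate_sample_format(const enum SampleFormat *out_fmts,
                                                 const enum SampleFormat *in_fmts)
{
    int i, j;
    for (i = 0; out_fmts[i] != FMT_NONE; i++)
        for (j = 0; in_fmts[j] != FMT_NONE; j++)
            if (out_fmts[i] == in_fmts[j])
                return out_fmts[i];
    return FMT_NONE; /* caller must insert a resampling/conversion filter */
}
```

When the intersection is empty, the framework would auto-insert the resampling filter described above to bridge the two pads.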
&lt;br /&gt;
=== Implement a better regressions test system ===&lt;br /&gt;
* Split up the current regtests&lt;br /&gt;
* Add tests for all the missing formats and codecs to FATE&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== libvo ===&lt;br /&gt;
* Port MPlayer's libvo to ffplay&lt;br /&gt;
* Note that this does not just mean producing a working hack so that ffplay can use xv, but writing a clean and acceptable wrapper for (most of) libvo.&lt;br /&gt;
&lt;br /&gt;
=== GStreamer input ===&lt;br /&gt;
* Just as we have VfW input, we could also have a GStreamer input format. This would enable support for wmapro and wmalossless until those formats are RE'd.&lt;br /&gt;
''Mentor: Christian Schaller''&lt;br /&gt;
&lt;br /&gt;
=== AMR-WB Decoder ===&lt;br /&gt;
* Specification: http://www.3gpp.org/ftp/Specs/html-info/26-series.htm&lt;br /&gt;
* Samples: http://samples.mplayerhq.hu/A-codecs/amr/&lt;br /&gt;
Also see [[AMR]].&lt;br /&gt;
&lt;br /&gt;
=== GSM Decoder ===&lt;br /&gt;
* Specification + sample implementation: http://kbs.cs.tu-berlin.de/~jutta/toast.html&lt;br /&gt;
* Samples: http://samples.mplayerhq.hu/A-codecs/GSM/&lt;br /&gt;
Also see [[GSM]].&lt;br /&gt;
&lt;br /&gt;
=== Sipr Decoder ===&lt;br /&gt;
* Specification: will be provided&lt;br /&gt;
* Samples: http://samples.mplayerhq.hu/real/AC-sipr/&lt;br /&gt;
Also see [[RealAudio sipr]] and [[Interesting_Patches#RealAudio_SIPR_.4016k_decoder_and_demuxer_by_Vladimir_Voroshilov|this patch]].&lt;br /&gt;
&lt;br /&gt;
=== Speex Decoder ===&lt;br /&gt;
* Specification:  http://speex.org/docs/&lt;br /&gt;
* Samples: http://samples.mplayerhq.hu/A-codecs/speex/&lt;br /&gt;
Also see [[Speex]].&lt;br /&gt;
&lt;br /&gt;
=== AMR-NB Encoder ===&lt;br /&gt;
* Specification: http://www.3gpp.org/ftp/Specs/html-info/26-series.htm&lt;br /&gt;
* Samples: http://samples.mplayerhq.hu/A-codecs/amr/&lt;br /&gt;
Also see [[AMR]].&lt;br /&gt;
&lt;br /&gt;
=== VP6 Encoder ===&lt;br /&gt;
* Specification: [[On2 VP6]]&lt;br /&gt;
* Samples: http://samples.mplayerhq.hu/V-codecs/VP6/&lt;br /&gt;
&lt;br /&gt;
=== WMV3 Encoder ===&lt;br /&gt;
* Clearly defined task&lt;br /&gt;
* Primary goal: Encode video sequences such that they can be decoded by a Windows Media player.&lt;br /&gt;
&lt;br /&gt;
This could either be done by improving [[Interesting Patches#WMV3 encoder by Denis Fortin|this patch]] or by writing the encoder from scratch.&lt;br /&gt;
&lt;br /&gt;
=== Improve subtitle support ===&lt;br /&gt;
&lt;br /&gt;
* Add text-to-bitmap conversion functions&lt;br /&gt;
* One with hard-coded bitmaps for characters&lt;br /&gt;
* One that utilizes freetype&lt;br /&gt;
* The function used will be chosen at compile time&lt;br /&gt;
&lt;br /&gt;
Adjust the existing subtitle support to the new ABI&lt;br /&gt;
* ABI change: http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2009-January/058521.html&lt;br /&gt;
&lt;br /&gt;
=== AACS implementation ===&lt;br /&gt;
* Add the ability to encode and decode using Advanced Access Content System to FFmpeg.&lt;br /&gt;
* Specifications: http://www.aacsla.com/specifications/&lt;br /&gt;
* Most parts (BD-J, MKB, title key generation) probably do not belong in FFmpeg; this should be discussed with us before submitting an application&lt;br /&gt;
&lt;br /&gt;
''Mentor: Reimar Döffinger''&lt;br /&gt;
&lt;br /&gt;
=== VC-1 Interlaced Support ===&lt;br /&gt;
* Add support for interlaced streams, as used in Blu-ray recordings, to the VC-1 decoder.&lt;br /&gt;
* This includes fixing some reference streams&lt;br /&gt;
&lt;br /&gt;
=== Improve Ratecontrol ===&lt;br /&gt;
*Primary goal 1: Fast heuristic VBV-compliant per-macroblock ratecontrol which has a better PSNR/bitrate and better subjective quality/bitrate than the current code.&lt;br /&gt;
*Primary goal 2: VBV-compliant, rate-distortion-optimal per-macroblock ratecontrol using the Viterbi algorithm.&lt;br /&gt;
*Secondary goal 1: Fast heuristic scene change detection which detects scene changes more accurately and has better PSNR/bitrate and subjective quality/bitrate than the current heuristic.&lt;br /&gt;
*Secondary goal 2: Rate-distortion-optimal (for the current picture) scene change detection.&lt;br /&gt;
*Secondary goal 3: B-frame decision which is faster and/or has a higher PSNR/bitrate and subjective quality/bitrate than the current code.&lt;br /&gt;
&lt;br /&gt;
=== WMA lossless ===&lt;br /&gt;
* Implement a decoder for WMA lossless (0x0163)&lt;br /&gt;
* Reuse as much libavcodec code as possible&lt;br /&gt;
* Samples: http://samples.mplayerhq.hu/A-codecs/WMA9/wma_0x163.wma http://samples.mplayerhq.hu/A-codecs/lossless/luckynight.wma&lt;br /&gt;
&lt;br /&gt;
=== WTV (de)muxer ===&lt;br /&gt;
* Implement a demuxer (and possibly a muxer) for the [[WTV]] file format.&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=Talk:FFmpeg_Summer_Of_Code_2009&amp;diff=11357</id>
		<title>Talk:FFmpeg Summer Of Code 2009</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=Talk:FFmpeg_Summer_Of_Code_2009&amp;diff=11357"/>
		<updated>2009-03-19T18:26:48Z</updated>

		<summary type="html">&lt;p&gt;Reimar: comments on libvo/AACS tasks&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=== S/PDIF muxer ===&lt;br /&gt;
&lt;br /&gt;
Is there any specific qualification task you would like done for this? -- Jai&lt;br /&gt;
&lt;br /&gt;
:A working Jpeg2000 decoder ;). Also, cleaning up this http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2005-June/001673.html would be welcome; it's an rpza encoder. --[[User:Merbanan|Merbanan]] 06:22, 31 December 2008 (EST)&lt;br /&gt;
&lt;br /&gt;
=== speex + gsm ===&lt;br /&gt;
&lt;br /&gt;
Aren't libgsm and libspeex distributed under a permissive license?&lt;br /&gt;
If yes, these tasks do not have very high priority, imo.&lt;br /&gt;
[[User:Ce|Ce]] 14:56, 11 January 2009 (EST)&lt;br /&gt;
&lt;br /&gt;
: That's not important. FFmpeg aims to support all multimedia formats. We have never let this argument stop us from implementing decoders. --[[User:DonDiego|DonDiego]] 08:22, 14 March 2009 (EDT)&lt;br /&gt;
&lt;br /&gt;
=== DTS-HD Master Audio decoder? ===&lt;br /&gt;
Would [http://en.wikipedia.org/wiki/DTS-HD_Master_Audio DTS-HD Master Audio] decoder make good project suggestion?  [[User:Gamester17|Gamester17]] 02:51, 16 January 2009 (EST)&lt;br /&gt;
&lt;br /&gt;
http://en.wikipedia.org/wiki/DTS-HD_Master_Audio&lt;br /&gt;
:&amp;quot;&amp;quot;''DTS-HD Master Audio is a lossless audio codec created by Digital Theater System. It was previously known as DTS++ and DTS-HD. It is an extension of DTS which, when played back on devices which do not support the Master Audio extension, degrades to a 1.5 Mbit/s &amp;quot;core&amp;quot; track which is lossy. DTS-HD Master Audio is an optional audio format for both Blu-ray Disc and HD DVD''&amp;quot;&lt;br /&gt;
&lt;br /&gt;
Specs, please. From what I know the projects without spec take a looong time to complete. --[[User:Kostya|Kostya]] 03:32, 16 January 2009 (EST)&lt;br /&gt;
&lt;br /&gt;
:AFAIK, there is even no software implementation, so it would be even more difficult;-( [[User:Ce|Ce]] 20:14, 16 January 2009 (EST)&lt;br /&gt;
&lt;br /&gt;
::How about then a qualification task for at least distinguishing between normal DTS and DTS-HD Master Audio, to let users know that the audio stream is DTS-HD Master Audio but they are only getting the normal DTS core output from FFmpeg? As of today FFmpeg reports all of them as just &amp;quot;''dca''&amp;quot;. (I understand that [http://mediainfo.sourceforge.net MediaInfo] is an open source C++ project that is capable of distinguishing between normal DTS and DTS-HD Master Audio.) [[User:Gamester17|Gamester17]] 12:20, 21 January 2009 (EST)&lt;br /&gt;
&lt;br /&gt;
=== WTV (Microsoft Windows Media Center Recording Format) demuxer? ===&lt;br /&gt;
Would a [[WTV|WTV (Microsoft Windows Media Center Recording Format)]] demuxer make good project suggestion? [[User:Gamester17|Gamester17]] 13:14, 16 January 2009 (EST)&lt;br /&gt;
&lt;br /&gt;
[[WTV|http://wiki.multimedia.cx/index.php?title=WTV]]&lt;br /&gt;
:&amp;quot;''WTV is the new container format used to record television shows in Microsoft Windows Vista Media Center starting with Windows Media Center TV Pack 2008.''&amp;quot;, &amp;quot;''WTV is the successor of DVR-MS, which is being replaced with WTV''&amp;quot;, &amp;quot;''WTV is also the default recording format for Windows 7 Media Center''&amp;quot;&lt;br /&gt;
&lt;br /&gt;
::This is tricky. It doesn't strike me as being involved enough to qualify as one of our usual SoC projects. OTOH, it seems a little too involved to be a qualification task. --[[User:Multimedia Mike|Multimedia Mike]] 14:24, 16 January 2009 (EST)&lt;br /&gt;
&lt;br /&gt;
::: That sounds great actually.  Maybe this could become one of the few SoC projects that are actually finished in time...--[[User:DonDiego|DonDiego]] 07:18, 18 January 2009 (EST)&lt;br /&gt;
&lt;br /&gt;
:::: +1! [[User:Ce|Ce]] 11:54, 18 January 2009 (EST)&lt;br /&gt;
:Added to article --[[User:Ce|Ce]] 12:56, 19 March 2009 (EDT)&lt;br /&gt;
&lt;br /&gt;
=== libavui (a common skins library)? ===&lt;br /&gt;
Would a common skins library make good project suggestion?&lt;br /&gt;
*MPlayer skin&lt;br /&gt;
*VLC skin&lt;br /&gt;
*Xine skin&lt;br /&gt;
*XMMS skin&lt;br /&gt;
*WINAMP skin&lt;br /&gt;
*Windows Media Player skin&lt;br /&gt;
*Rockbox skin&lt;br /&gt;
*foobar2000 skin&lt;br /&gt;
*Songbird feathers (skin)&lt;br /&gt;
-[[User:Nazo|Nazo]] 21:29, 16 January 2009 (EST)&lt;br /&gt;
: Personally, I would advocate a project to stamp out skinnable UIs across the computing landscape. But that's outside of the scope of an SoC project. I hate UI skins. --[[User:Multimedia Mike|Multimedia Mike]] 14:03, 17 January 2009 (EST)&lt;br /&gt;
:: I second that.  But I don't see how GUI stuff like promoting or discouraging skins relates to libav* in the first place. [[User:Koorogi|Koorogi]] 16:26, 17 January 2009 (EST)&lt;br /&gt;
: No, skins are outside the scope of FFmpeg.--[[User:DonDiego|DonDiego]] 07:18, 18 January 2009 (EST)&lt;br /&gt;
&lt;br /&gt;
==Refactor VDPAU patch for video editing ==&lt;br /&gt;
This might be a good project: http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2009-January/059032.html but I don't know for sure, so that's why I am including it on this page. [[User:Dashcloud|Dashcloud]] 16:23, 20 January 2009 (EST)&lt;br /&gt;
&lt;br /&gt;
==More pixel format support?==&lt;br /&gt;
Would more pixel format support make good project suggestion? Here is crazy missing pixel format list (from HDPhoto):&lt;br /&gt;
*1/2/4bpp palette - 8bpp is already supported&lt;br /&gt;
*1/2/4/32bpp gray - 8bpp and 16bpp are already supported&lt;br /&gt;
*16bpp gray fixedpoint&lt;br /&gt;
*32bpp gray float&lt;br /&gt;
*48/96bpp RGB - 24bpp is already supported&lt;br /&gt;
*48/64bpp RGB half&lt;br /&gt;
*48/64/96/128bpp RGB fixedpoint&lt;br /&gt;
*32bpp RGB101010&lt;br /&gt;
*96/128bpp RGB float&lt;br /&gt;
*64/128bpp RGBA - 32bpp is already supported&lt;br /&gt;
*64/128bpp RGBA fixedpoint&lt;br /&gt;
*64bpp RGBA half&lt;br /&gt;
*128bpp RGBA float&lt;br /&gt;
*32bpp BGR - 24bpp is already supported&lt;br /&gt;
*32bpp PBGRA&lt;br /&gt;
*64bpp PRGBA&lt;br /&gt;
*128bpp PRGBA float&lt;br /&gt;
*32bpp RGBE&lt;br /&gt;
*32/64bpp CMYK&lt;br /&gt;
*40/80bpp CMYKAlpha&lt;br /&gt;
*12bpp YUV420&lt;br /&gt;
*16bpp YUV422&lt;br /&gt;
*24bpp YUV444&lt;br /&gt;
*24bpp 3Channels&lt;br /&gt;
*32bpp 4Channels&lt;br /&gt;
*40bpp 5Channels&lt;br /&gt;
*48bpp 6Channels&lt;br /&gt;
*56bpp 7Channels&lt;br /&gt;
*64bpp 8Channels&lt;br /&gt;
*48bpp 3Channels&lt;br /&gt;
*64bpp 4Channels&lt;br /&gt;
*80bpp 5Channels&lt;br /&gt;
*96bpp 6Channels&lt;br /&gt;
*112bpp 7Channels&lt;br /&gt;
*128bpp 8Channels&lt;br /&gt;
*32bpp 3ChannelsAlpha&lt;br /&gt;
*40bpp 4ChannelsAlpha&lt;br /&gt;
*48bpp 5ChannelsAlpha&lt;br /&gt;
*56bpp 6ChannelsAlpha&lt;br /&gt;
*64bpp 7ChannelsAlpha&lt;br /&gt;
*72bpp 8ChannelsAlpha&lt;br /&gt;
*64bpp 3ChannelsAlpha&lt;br /&gt;
*80bpp 4ChannelsAlpha&lt;br /&gt;
*96bpp 5ChannelsAlpha&lt;br /&gt;
*112bpp 6ChannelsAlpha&lt;br /&gt;
*128bpp 7ChannelsAlpha&lt;br /&gt;
*144bpp 8ChannelsAlpha&lt;br /&gt;
--[[User:Nazo|Nazo]] 07:24, 21 January 2009 (EST)&lt;br /&gt;
&lt;br /&gt;
: See [[Small FFmpeg Tasks#Generic Colorspace system]]. [[User:Vitor|Vitor]] 14:11, 22 January 2009 (EST)&lt;br /&gt;
:: need samples for each one at least. --[[User:Compn|Compn]] 07:33, 23 January 2009 (EST)&lt;br /&gt;
::: and HDPhoto decoder --[[User:Kostya|Kostya]] 08:06, 23 January 2009 (EST)&lt;br /&gt;
:::: The HDPhoto format is similar to TIFF. I remembered that HDPhoto supports uncompressed images, but I checked, and that is reserved for future use :-( --[[User:Nazo|Nazo]] 09:46, 23 January 2009 (EST)&lt;br /&gt;
&lt;br /&gt;
==VC-1 Interlaced Support==&lt;br /&gt;
Blu-ray media contain interlaced VC-1, which is currently not supported by the lavc decoder. Since a lot of people have shown a lack of interest in implementing it, maybe some student will take this task. --[[User:Kostya|Kostya]] 09:23, 26 January 2009 (EST)&lt;br /&gt;
:Added to article --[[User:Ce|Ce]] 09:53, 4 March 2009 (CST)&lt;br /&gt;
&lt;br /&gt;
==Interactive command ui support==&lt;br /&gt;
An interactive command UI might be a good starting point for implementing missing features for GUI encoders.&lt;br /&gt;
* more presets&lt;br /&gt;
* codecs containable in each container format&lt;br /&gt;
* mapping between encode options and proper UI types&lt;br /&gt;
* playable codecs and encode options on each player (WMP, RealPlayer, Flash Player, Silverlight, PS3, mobile phones, etc...)&lt;br /&gt;
* list of usable metadata tags in each format&lt;br /&gt;
* etc...&lt;br /&gt;
--[[User:Nazo|Nazo]] 05:22, 28 February 2009 (CST)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Maybe remove/downshift some tasks==&lt;br /&gt;
Some tasks would be hard to complete:&lt;br /&gt;
* Flash Screen video 2 codec - without Mike providing documentation it's a bit harder. I remember one student who was promised RV40 specs and got almost nothing; the decoder was more or less completed successfully, but it took slightly more time.&lt;br /&gt;
* GStreamer input - a rather weak excuse. Better to make it the task of REing WMA lossless (WMApro is almost there).&lt;br /&gt;
* i263 decoder - FFmpeg supports i263 to some extent. The missing bits probably make for only a small FFmpeg task.&lt;br /&gt;
--[[User:Kostya|Kostya]] 10:39, 4 March 2009 (CST)&lt;br /&gt;
&lt;br /&gt;
I also have my doubts about libvo (a typical cleanup task, which has not worked out well in the past) and the AACS task.&lt;br /&gt;
AACS could work out, but IMO only the &amp;quot;last&amp;quot; part, decoding when the title key has already been found, fits in FFmpeg, and then I think it should at least be AACS + CSS (I hope they would be able to share some code; at least they use the same bits for flagging the encryption).&lt;br /&gt;
&lt;br /&gt;
[[User:Reimar|Reimar]] 14:26, 19 March 2009 (EDT)&lt;br /&gt;
&lt;br /&gt;
== ffmpeg.c factorization ==&lt;br /&gt;
&lt;br /&gt;
Every once in a while, someone shows up at ffmpeg-devel with a patch for turning ffmpeg.c into a library (for example [http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2009-January/059828.html]). While this is a bad idea for several reasons, having a lot of people asking for it shows that it is very hard to use libav* in a client application while being as flexible as the command-line tool.&lt;br /&gt;
&lt;br /&gt;
So I suggest an SoC project of factorizing ffmpeg.c into several public, clean, well-documented functions, to simplify the use of libav* and turn ffmpeg.c into a good API example.&lt;br /&gt;
&lt;br /&gt;
* '''Note''': The tricky part of this project would be to factorize ffmpeg.c into functions that actually should be in a lib, not just mechanically moving code to libav*/something.c&lt;br /&gt;
&lt;br /&gt;
-[[User:Vitor|Vitor]] 13:45, 4 March 2009 (CST)&lt;br /&gt;
&lt;br /&gt;
== Rate control project ==&lt;br /&gt;
I copied this one from the 08 SoC page because I think it would be a great project- but if people think it doesn't need to be copied from last year's page, then I'll delete the entry. [[User:Dashcloud|Dashcloud]] 22:08, 4 March 2009 (CST)&lt;br /&gt;
&lt;br /&gt;
What does viterbi have to do with macroblock-level ratecontrol, with or without VBV constraints? I can vaguely see viterbi on frame-level VBV compliance, using &amp;quot;remaining VBV space&amp;quot; as the state to be trellised over, if you're willing to quantize possible frame sizes enough to bring the number of states down to something sane. But that isn't relevant to macroblocks: VBV doesn't impose any constraints smaller than a frame, so macroblock-level is plain old independent RDO. [[User:Pengvado|Pengvado]] 08:33, 13 March 2009 (EDT)&lt;br /&gt;
&lt;br /&gt;
== WMA lossless  ==&lt;br /&gt;
&lt;br /&gt;
I'm not sure this would qualify as a project unless someone (generally someone other than the student, unless he is really gifted) takes the responsibility of reverse engineering it.&lt;br /&gt;
&lt;br /&gt;
-[[User:Vitor|Vitor]] 13:50, 6 March 2009 (CST)&lt;br /&gt;
&lt;br /&gt;
The situation is the same as with WMA3 lossy, and we have a working decoder for it.&lt;br /&gt;
&lt;br /&gt;
--[[User:Kostya|Kostya]] 23:38, 6 March 2009 (CST)&lt;br /&gt;
&lt;br /&gt;
== Shoutcast support ==&lt;br /&gt;
&lt;br /&gt;
Is adding support for shoutcast (streaming &amp;amp; receiving) a viable task? -- Jai&lt;br /&gt;
:preliminary patch here: https://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2009-January/060556.html --[[User:Compn|Compn]] 11:42, 14 March 2009 (EDT)&lt;br /&gt;
&lt;br /&gt;
== Playlist/Concatenation Support ==&lt;br /&gt;
&lt;br /&gt;
I suspect that the interface can be easily extended with only two options: -playlist X and -concat, where the latter tells FFmpeg to concatenate all input files instead of processing them in parallel. The problem will mostly be in format negotiation during concatenation (i.e. frame dimension and rate mismatches between different files). --[[User:Kostya|Kostya]] 07:52, 19 March 2009 (EDT)&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=Small_FFmpeg_Tasks&amp;diff=11329</id>
		<title>Small FFmpeg Tasks</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=Small_FFmpeg_Tasks&amp;diff=11329"/>
		<updated>2009-03-17T13:19:34Z</updated>

		<summary type="html">&lt;p&gt;Reimar: A demuxer for the Metal Gear Solid is a suitable small task as well&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page contains ideas for small, relatively simple tasks for the [[FFmpeg]] project. People who might be interested in trying one of these tasks:&lt;br /&gt;
* Someone who wants to contribute to FFmpeg and needs to find a well-defined task to start with&lt;br /&gt;
* Someone who wishes to qualify for one of FFmpeg's coveted [[FFmpeg Summer Of Code|Summer of Code]] project slots&lt;br /&gt;
* An existing FFmpeg developer who has been away from the project for a while and needs a smaller task as motivation for re-learning the codebase&lt;br /&gt;
&lt;br /&gt;
For other tasks of varying difficulty, see the [[Interesting Patches]] page.&lt;br /&gt;
&lt;br /&gt;
'''If you would like to work on one of these tasks''', please take these steps:&lt;br /&gt;
* Subscribe to the [https://lists.mplayerhq.hu/mailman/listinfo/ffmpeg-devel FFmpeg development mailing list] and indicate your interest&lt;br /&gt;
* Ask [[User:Multimedia Mike|Multimedia Mike]] for a Wiki account so you can claim your task on this Wiki&lt;br /&gt;
&lt;br /&gt;
'''If you would like to add to this list''', please be prepared to explain some useful details about the task. Excessively vague tasks with no supporting details will be ruthlessly deleted.&lt;br /&gt;
&lt;br /&gt;
=== Finish up a previous incomplete SoC project ===&lt;br /&gt;
&lt;br /&gt;
Several SoC projects from previous years have not yet made it into FFmpeg. Taking any of them and finishing them up to the point that they can be included should make for a good qualification task. Check out the [[FFmpeg Summer Of Code]] overview page and look for the unfinished projects, like AMR-NB, Dirac, TS muxer, JPEG 2000.&lt;br /&gt;
&lt;br /&gt;
=== Generic Colorspace system ===&lt;br /&gt;
This task involves adding support for more than 8 bits per component (for example Y on 10 bits, U on 10 bits, V on 10 bits)&lt;br /&gt;
and generic simple conversion to other colorspaces.&lt;br /&gt;
&lt;br /&gt;
''Does this have to do with revising FFmpeg's infrastructure? If so, then it doesn't feel like a qualification task. If it's something simpler, then the vague description does not convey that simplicity. Please expound.'' --[[User:Multimedia Mike|Multimedia Mike]] 12:56, 25 February 2008 (EST)&lt;br /&gt;
&lt;br /&gt;
''I don't think so; it would mean extending PixFmt to an extended structure with a fine-grained description (depth, range values, colorspace, sample period) and writing a generic simple conversion from all formats to all others, as suggested by Michael on the mailing list. The conversion routine can be a good qualification task for video encoders/decoders. What do you think?&lt;br /&gt;
--[[User:Bcoudurier|Baptiste Coudurier]] 00:30, 29 February 2008 (EST)&lt;br /&gt;
&lt;br /&gt;
''* Adding the [[YCoCg]] colorspace (with different sized planes) for RGB sourced pictures would be nice too. [[User:Elte|Elte]] 07:15, 16 March 2009 (EDT)&lt;br /&gt;
&lt;br /&gt;
=== Make the SoC dts encoder multichannel capable ===&lt;br /&gt;
Here is a skeleton for a dts encoder: http://svn.mplayerhq.hu/soc/dcaenc/ . Currently it can only encode stereo streams.&lt;br /&gt;
The task is to extend it to support 5.1 channels also.&lt;br /&gt;
&lt;br /&gt;
Specs and info can be found here:&lt;br /&gt;
http://wiki.multimedia.cx/index.php?title=DTS&lt;br /&gt;
&lt;br /&gt;
=== GIF LZW Encoder and extend Encoder and Decoder to support Animated GIFs ===&lt;br /&gt;
&lt;br /&gt;
An LZW encoder is already used for TIFF; it must be extended to support the GIF flavor.&lt;br /&gt;
&lt;br /&gt;
=== Patch cleanup for MPEG 1 &amp;amp; 2 optimizations ===&lt;br /&gt;
Details are in the issue tracker: http://roundup.ffmpeg.org/roundup/ffmpeg/issue100&lt;br /&gt;
&lt;br /&gt;
=== Implement a Vivo demuxer for FFmpeg ===&lt;br /&gt;
Implement an FFmpeg demuxer for the [[Vivo]] file format. The best reference for understanding the format would be MPlayer's [http://svn.mplayerhq.hu/mplayer/trunk/libmpdemux/demux_viv.c?view=markup existing .viv demuxer].&lt;br /&gt;
&lt;br /&gt;
This task corresponds to issue 99: http://roundup.ffmpeg.org/roundup/ffmpeg/issue99&lt;br /&gt;
&lt;br /&gt;
''I am ready to help out with understanding MPlayer's demuxer, esp. MPlayer API stuff if necessary.&lt;br /&gt;
--[[User:Reimar|Reimar]] 15:46, 1 March 2008 (EST)&lt;br /&gt;
&lt;br /&gt;
=== Port missing demuxers from MPlayer to FFmpeg ===&lt;br /&gt;
MPlayer supports a few container formats in libmpdemux that are not yet present in libavformat. Porting them over and getting them relicensed as LGPL, or reimplementing them from scratch, should make reasonable small tasks.&lt;br /&gt;
&lt;br /&gt;
''Jai Menon is working on porting the tivo demuxer''&lt;br /&gt;
&lt;br /&gt;
=== Optimal Huffman tables for (M)JPEG ===&lt;br /&gt;
This task is outlined at http://guru.multimedia.cx/small-tasks-for-ffmpeg/ and is tracked in the issue tracker: http://roundup.ffmpeg.org/roundup/ffmpeg/issue267&lt;br /&gt;
:Indrani Kundu Saha is currently working on this task as a qualification for Google SoC 2009 --[[User:Ce|Ce]] 19:41, 13 March 2009 (EDT)&lt;br /&gt;
&lt;br /&gt;
=== YOP Playback System ===&lt;br /&gt;
This task is to implement an FFmpeg playback subsystem for [[Psygnosis YOP]] files. This will entail writing a new file demuxer and video decoder, both of which are trivial by FFmpeg standards. [[Psygnosis YOP|The Psygnosis YOP page]] contains the specs necessary to complete this task and points to downloadable samples.&lt;br /&gt;
&lt;br /&gt;
=== M95 Playback System ===&lt;br /&gt;
This task is to implement an FFmpeg playback subsystem for [[M95]] files. This will entail writing a new file demuxer and video decoder (the audio is already uncompressed), both of which are trivial by FFmpeg standards. [[M95|The M95 page]] contains the specs necessary to complete this task and points to downloadable samples.&lt;br /&gt;
&lt;br /&gt;
=== BRP Playback System ===&lt;br /&gt;
This task is to implement an FFmpeg playback subsystem for [[BRP]] files. This will entail writing a new file demuxer as well as a video decoder that can handle at least 2 variations of format data. Further, write an audio decoder for the custom DPCM format in the file. All of these tasks are considered trivial by FFmpeg standards. [[BRP|The BRP page]] contains the specs necessary to complete this task and points to downloadable samples for both known variations.&lt;br /&gt;
&lt;br /&gt;
=== 16-bit Interplay Video Decoder ===&lt;br /&gt;
FFmpeg already supports [[Interplay MVE]] files with [[Interplay Video|8-bit video data]] inside. This task involves supporting 16-bit video data. The video encoding format is mostly the same but the pixel size is twice as large. Engage the ffmpeg-devel list to discuss how best to approach this task.&lt;br /&gt;
&lt;br /&gt;
=== 16-bit VQA Video Decoder ===&lt;br /&gt;
FFmpeg already supports Westwood [[VQA]] files. However, there are 3 variations of its custom video codec. The first 2 are supported in FFmpeg. This task involves implementing support for the 3rd variation. Visit the VQA samples repository: http://samples.mplayerhq.hu/game-formats/vqa/ -- The files in the directories Tiberian Sun VQAs/, bladerunner/, and dune2000/ use the 3rd variation of this codec. The [[VQA|VQA page]] should link to all the details you need to support this format.&lt;br /&gt;
&lt;br /&gt;
=== HNM4 Playback System ===&lt;br /&gt;
This task is to implement an FFmpeg playback subsystem for the [[HNM4]] variant of the [[HNM]] format. This will entail writing a new file demuxer and video decoder, both of which are trivial by FFmpeg standards. [[HNM4|The HNM4 page]] contains the specs necessary to complete this task and links to downloadable samples.&lt;br /&gt;
&lt;br /&gt;
=== Apple RPZA encoder ===&lt;br /&gt;
A patch was once sent to the ffmpeg-devel mailing list to include an encoder for the [[Apple RPZA]] video codec. That code can be found on the &amp;quot;[[Interesting Patches]]&amp;quot; page. This qualification task involves applying that patch so that it can compile with current FFmpeg SVN code and then cleaning it up per the standards of the project. Engage the mailing list to learn more about what to do.&lt;br /&gt;
&lt;br /&gt;
=== QuickTime Edit List Support ===&lt;br /&gt;
Implement edit list support in FFmpeg's QuickTime demuxer (libavformat/mov.c). This involves parsing the 'elst' atom in a QuickTime file. For a demonstration of the problem, download the file menace00.mov from http://samples.mplayerhq.hu/mov/editlist/ and play it with ffplay or transcode it with ffmpeg. Notice that the audio and video are ever so slightly out of sync. Proper edit list support will solve that. Other samples in that directory also presumably exhibit edit list-related bugs. The [http://xine.cvs.sourceforge.net/xine/xine-lib/src/demuxers/demux_qt.c?view=markup Xine demuxer] has support for this; it might be useful for hints.&lt;br /&gt;
&lt;br /&gt;
:Krishna Gadepalli is working on this (patch submitted to ffmpeg-devel , currently in review) --[[User:Compn|Compn]] 10:35, 14 March 2009 (EDT)&lt;br /&gt;
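To make the parsing part of this task concrete, here is a hedged sketch (this is illustrative code, not FFmpeg's actual mov.c) of reading the payload of a version-0 'elst' atom. Each entry carries a segment duration in the movie timescale, a media time in the media timescale (-1 marks an empty edit that delays the track, the usual cause of the slight A/V desync described above), and a 16.16 fixed-point media rate.

```c
/* Illustrative sketch, not FFmpeg's actual mov.c code: parse the payload of
 * a version-0 'elst' (edit list) atom, i.e. the bytes after size and type. */

typedef struct EditListEntry {
    unsigned segment_duration; /* movie timescale units */
    int      media_time;       /* media timescale units; -1 = empty edit */
    unsigned media_rate;       /* 16.16 fixed point, normally 0x00010000 */
} EditListEntry;

static unsigned read_be32(const unsigned char *p)
{
    /* big-endian 32-bit read */
    return p[0] * 16777216u + p[1] * 65536u + p[2] * 256u + p[3];
}

/* buf points just past the atom size/type; returns entries parsed */
static int parse_elst(const unsigned char *buf, EditListEntry *out,
                      int max_entries)
{
    unsigned count = read_be32(buf + 4); /* skip 1 byte version, 3 bytes flags */
    const unsigned char *p = buf + 8;
    unsigned i;
    for (i = 0; i != count; i++) {
        if ((int)i == max_entries)
            break;
        out[i].segment_duration = read_be32(p);
        out[i].media_time       = (int)read_be32(p + 4);
        out[i].media_rate       = read_be32(p + 8);
        p += 12;
    }
    return (int)i;
}
```

The demuxer would then offset the track's timestamps by the empty-edit duration (and skip media before the first media_time) instead of ignoring the atom.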
&lt;br /&gt;
=== Reimplement libavcodec/fdctref.c ===&lt;br /&gt;
The forward double precision DCT in this file has a non-free license. We need an LGPL replacement of this file.&lt;br /&gt;
&lt;br /&gt;
=== Implement the Flash Screen Video codec version 2 ===&lt;br /&gt;
FFmpeg is missing both a decoder and an encoder. Would be nice to have that.&lt;br /&gt;
&lt;br /&gt;
=== Add wma fixed point decoder back into libavcodec ===&lt;br /&gt;
http://svn.rockbox.org/viewvc.cgi/trunk/apps/codecs/libwma/&lt;br /&gt;
Rockbox's fixed-point WMA decoder was adapted from the decoder in libavcodec.&lt;br /&gt;
&lt;br /&gt;
=== RealAudio 14.4 encoder ===&lt;br /&gt;
FFmpeg contains a decoder for [[RealAudio 14.4]], a fairly simple integer CELP codec.  Write an encoder.  This would be a good qualification task for anyone interested in working on AMR, Speex, or sipr.&lt;br /&gt;
&lt;br /&gt;
=== VC1 timestamps in m2ts ===&lt;br /&gt;
&lt;br /&gt;
Codec copy of VC1 from m2ts currently doesn't work. Either extend the VC1 parser to output/fix timestamps, or fix the timestamps from m2ts demuxing.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== FLIC work ===&lt;br /&gt;
&lt;br /&gt;
Revise the [[Flic Video]] decoder at libavcodec/flicvideo.c to support video transported in AVI or MOV files while making sure that data coming from the usual FLI files still works. 'AFLC' and 'flic' FourCC samples are linked from the [[Flic Video]] page.&lt;br /&gt;
&lt;br /&gt;
=== Hook up QT YUV2 FourCC ===&lt;br /&gt;
&lt;br /&gt;
Wire up the YUV2 FourCC that can occur in [[MOV]] to the [[YUV 4:2:2]] colorspace. Samples are linked from the [[YUV 4:2:2]] wiki page.&lt;br /&gt;
[[Category:FFmpeg]]&lt;br /&gt;
&lt;br /&gt;
=== CorePNG Decoder ===&lt;br /&gt;
&lt;br /&gt;
Extend FFmpeg's PNG decoder to handle the difference frames and [[YUV]] colorspace added in [[CorePNG]]. Sample at [http://samples.mplayerhq.hu/V-codecs/PNG1/ http://samples.mplayerhq.hu/V-codecs/PNG1/]&lt;br /&gt;
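The difference-frame part of this idea can be sketched as follows (an illustrative reconstruction step under the assumption that a CorePNG P-frame stores per-byte deltas from the previous output frame, wrapping modulo 256; not FFmpeg's actual decoder code):

```c
/* Hedged sketch of CorePNG P-frame reconstruction, assuming delta frames
 * are added byte-wise to the previous decoded frame modulo 256. */
static void apply_diff_frame(unsigned char *prev, const unsigned char *diff,
                             int n_bytes)
{
    int i;
    for (i = 0; i != n_bytes; i++)
        prev[i] = (unsigned char)(prev[i] + diff[i]); /* wraps mod 256 */
}
```

The decoded PNG payload of a P-frame would be fed through this step in place, so `prev` then holds the new output frame.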
&lt;br /&gt;
=== Extend PNG Decoder ===&lt;br /&gt;
&lt;br /&gt;
Get this PNG working in ffpng: http://roundup.ffmpeg.org/roundup/ffmpeg/issue813 .&lt;br /&gt;
&lt;br /&gt;
=== CJPG format ===&lt;br /&gt;
&lt;br /&gt;
Extend FFmpeg's MJPEG decoder to handle the different frames/packing of CJPG. Samples at: http://roundup.ffmpeg.org/roundup/ffmpeg/issue777&lt;br /&gt;
&lt;br /&gt;
=== Optimize Theora Decoder ===&lt;br /&gt;
&lt;br /&gt;
Speed up the Theora decoder. The [http://www.archive.org/download/AlternativeFreedom/alternative_freedom.ogg 720x480 sample] hits 100% CPU on a P4 1.5GHz.&lt;br /&gt;
:''Do you have any specific optimizations tips? I like these small tasks to present a clearer jumping-off point. --[[User:Multimedia Mike|Multimedia Mike]] 18:57, 22 December 2008 (EST)''&lt;br /&gt;
::''did theora make use of the mmx/sse functions of ffvp3? i was looking at the xiph GSOC page which mentioned a similar task. --[[User:Compn|Compn]] 21:17, 22 December 2008 (EST)''&lt;br /&gt;
::''The major optimization I can think of is reworking coefficient decoding to avoid the continue in unpack_vlcs() (basically by having a list of coefficient VLCs for each position rather than for each block, then decoding them when actually rendering the block.) Unfortunately this also requires reworking render_slice() and reverse_dc_prediction() quite significantly which is why I haven't done it yet. [[User:Yuvi|Yuvi]] 18:25, 23 December 2008 (EST)''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== flip flag for upside-down codecs ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;about the flip, a patch that decodes images fliped when&lt;br /&gt;
codec_tag == ff_get_fourcc(&amp;quot;GEOX&amp;quot;) is welcome.&lt;br /&gt;
its a metter of 2lines manipulating data/linesize of imgages after&lt;br /&gt;
get_buffer() or something similar&lt;br /&gt;
[...]&lt;br /&gt;
-- &lt;br /&gt;
Michael     GnuPG fingerprint: 9FF2128B147EF6730BADF133611EC787040B0FAB&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
More info: http://roundup.ffmpeg.org/roundup/ffmpeg/issue741&lt;br /&gt;
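The trick described in the quote can be sketched as follows. This is an illustrative Python model, not FFmpeg API: a plane is represented as a flat buffer read through an offset and a stride, mirroring how data/linesize could be adjusted after get_buffer() so rows are read bottom-up without copying.

```python
def flip_plane(buf, width, height, linesize):
    """Return (offset, stride) so that reading rows with the new
    offset/stride yields the image upside-down, without copying."""
    offset = linesize * (height - 1)   # point at the first byte of the last row
    stride = -linesize                 # walk backwards through memory
    return offset, stride

def read_rows(buf, offset, stride, width, height):
    rows = []
    for i in range(height):
        start = offset + i * stride
        rows.append(buf[start:start + width])
    return rows

# 3 rows of 4 bytes each, filled with the row index: 0, 1, 2
plane = b"".join(bytes([y] * 4) for y in range(3))
off, st = flip_plane(plane, 4, 3, 4)
print(read_rows(plane, off, st, 4, 3))  # rows come out in order 2, 1, 0
```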
&lt;br /&gt;
=== lavf-based concatenation tool ===&lt;br /&gt;
&lt;br /&gt;
Until FFmpeg supports multiple input files, it would be nice to have a libavformat-based tool that extracts frames from multiple files (possibly in different containers as well) and puts them into a single one.&lt;br /&gt;
&lt;br /&gt;
=== cljr and vcr1 encoders ===&lt;br /&gt;
According to this: http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2009-February/063647.html both of the encoders are disabled, and won't compile if enabled.  Michael would prefer to keep them around, and have someone grow them into full encoders.&lt;br /&gt;
&lt;br /&gt;
=== Add WAVEFORMATEXTENSIBLE support to the WAV muxer ===&lt;br /&gt;
http://article.gmane.org/gmane.comp.video.ffmpeg.devel/79503&lt;br /&gt;
Clean up that patch.&lt;br /&gt;
&lt;br /&gt;
=== implement some colorspace fourcc/codecs ===&lt;br /&gt;
Some colorspace formats were uploaded to http://samples.mplayerhq.hu/V-codecs/, including:&lt;br /&gt;
 2vuy.avi&lt;br /&gt;
 CYUV.AVI&lt;br /&gt;
 P422.AVI&lt;br /&gt;
 UYNV.AVI&lt;br /&gt;
 UYNY.avi&lt;br /&gt;
 V422.AVI&lt;br /&gt;
 YUNV.AVI&lt;br /&gt;
 a12v.avi&lt;br /&gt;
 auv2.avi&lt;br /&gt;
 and V-codecs/yuv8/MAILTEST.AVI .&lt;br /&gt;
&lt;br /&gt;
Step-by-step tutorial for adding new input formats to swscale:&lt;br /&gt;
 cd mplayer/libswscale/&lt;br /&gt;
 svn di -r20426:20427&lt;br /&gt;
 hunks 3 and 5 are not needed; they are optional special converters&lt;br /&gt;
 the change to isSupportedOut() is also not needed&lt;br /&gt;
 the above will add a new input format&lt;br /&gt;
&lt;br /&gt;
Another example of adding an input format:&lt;br /&gt;
 cd mplayer/libswscale/&lt;br /&gt;
 svn di -r20604:20605&lt;br /&gt;
&lt;br /&gt;
=== Create a libamr compatible library of the Android amr codec ===&lt;br /&gt;
http://android.git.kernel.org/?p=platform/external/opencore.git;a=tree;f=codecs_v2/audio/gsm_amr/amr_nb;h=4bac3ee5bd1ae8b6955f2d0bdac7de43c0d985c1;hb=HEAD&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Make the rtp demuxer support rtcp BYE packets ===&lt;br /&gt;
RTCP BYE (packet type 203) packets are sent by the sender to notify the receiver that a stream has ended.&lt;br /&gt;
FFmpeg currently ignores them.&lt;br /&gt;
&lt;br /&gt;
Sample URL: rtsp://media.lscube.org/tests/tc.mov&lt;br /&gt;
&lt;br /&gt;
=== Implement the RTP/Vorbis payload ===&lt;br /&gt;
This is supported by the [http://www.lscube.org/projects/feng feng RTSP server], and is described in [http://tools.ietf.org/html/rfc5215 RFC 5215]. For testing, you can set up a local feng RTSP server to stream some local Vorbis file, or you can use the online feng test-server (rtsp://media.lscube.org:554/tests/rms_profumo_1.ogv).&lt;br /&gt;
&lt;br /&gt;
Most likely, your implementation will consist of a file called rtp_vorbis.c in libavformat/, which will read the header packets available in the SDP (the &amp;quot;configuration&amp;quot; piece in the fmtp: line) and which parses individual incoming RTP packets from the RTSP demuxer (minus the generic RTP header bits). It should output Vorbis-encoded frames which can subsequently be decoded by the Vorbis decoder in libavcodec/.&lt;br /&gt;
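The SDP side of this can be sketched as follows. This is a minimal illustration in Python (a real rtp_vorbis.c would do this in C on libavformat structures), and the fmtp line shown is made up; per RFC 5215 the &amp;quot;configuration&amp;quot; parameter carries the base64-encoded Vorbis header data.

```python
import base64

def parse_fmtp(line):
    """Split an SDP 'a=fmtp:' attribute into a parameter dict."""
    _, _, params = line.partition(" ")
    out = {}
    for item in params.split(";"):
        key, _, value = item.strip().partition("=")
        out[key] = value
    return out

# Hypothetical fmtp line as a server might send it; the configuration
# payload here is dummy data, not real Vorbis headers.
line = "a=fmtp:96 configuration=AAAAAQ==; delivery-method=inline"
params = parse_fmtp(line)
headers = base64.b64decode(params["configuration"])
print(params["delivery-method"], len(headers))  # inline 4
```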
&lt;br /&gt;
=== Implement the RTP/Theora payload ===&lt;br /&gt;
The Theora payload is currently still a [http://svn.xiph.org/trunk/theora/doc/draft-ietf-avt-rtp-theora-00.txt draft]. Yet, it would be nice to support this payload. As per above, the [http://www.lscube.org/projects/feng feng RTSP server] supports the Theora RTP payload draft and can be used for testing your implementation of the draft, or you can use the online feng test-server (rtsp://media.lscube.org:554/tests/rms_profumo_1.ogv).&lt;br /&gt;
&lt;br /&gt;
Most likely, your implementation will consist of a file called rtp_theora.c in libavformat/, which will read the header packets available in the SDP (the &amp;quot;configuration&amp;quot; piece in the fmtp: line) and which parses individual incoming RTP packets from the RTSP demuxer (minus the generic RTP header bits). It should output Theora-encoded frames which can subsequently be decoded by the Theora decoder in libavcodec/.&lt;br /&gt;
&lt;br /&gt;
=== cdg decoder + demuxer ===&lt;br /&gt;
Create a [[CD Graphics]] decoder/demuxer. Existing implementations: http://www.kibosh.org/pykaraoke/ or http://users.fbihome.de/~glogow/ or http://miageprojet.unice.fr/twiki/bin/view/Fun/JavaKarPlayer or http://www.kibosh.org/cdgtools/ or http://bat-kolio.net/cdg2video/ (which uses ffmpeg).&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== support for YCoCg/RGB colorspace in FFV1 ===&lt;br /&gt;
Add support for [[YCoCg]] and [[RGB]] encoded sources.&lt;br /&gt;
&lt;br /&gt;
This would add a free lossless intra-frame RGB codec for all platforms supported by FFmpeg (most importantly MacOS and Windows), something often requested for video editing in video forums (e.g. slashcam.de).&lt;br /&gt;
&lt;br /&gt;
=== Metal Gear Solid Video format demuxer ===&lt;br /&gt;
http://multimedia.cx/eggs/metal-gear-vp3/&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=V210&amp;diff=11129</id>
		<title>V210</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=V210&amp;diff=11129"/>
		<updated>2009-02-17T19:33:45Z</updated>

		<summary type="html">&lt;p&gt;Reimar: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[http://developer.apple.com/quicktime/icefloe/dispatch019.html#v210 v210 QuickTime Description]&lt;br /&gt;
&lt;br /&gt;
The v210 format is a packed YUV 4:2:2 (UYVY) format with 10 bits per component.&lt;br /&gt;
&lt;br /&gt;
Data is stored in blocks of 32 bit values in little-endian.&lt;br /&gt;
&lt;br /&gt;
Each such block contains 3 components, one each in bits 0 - 9, 10 - 19 and 20 - 29; the remaining two bits are unused.&lt;br /&gt;
&lt;br /&gt;
Since UYVY consists of 4 components that represent two pixels, which of those bits correspond to which component makes a pattern that repeats every 4 32-bit blocks:&lt;br /&gt;
&lt;br /&gt;
  block 1, bits  0 -  9: U0+1&lt;br /&gt;
  block 1, bits 10 - 19: Y0&lt;br /&gt;
  block 1, bits 20 - 29: V0+1&lt;br /&gt;
  block 2, bits  0 -  9: Y1&lt;br /&gt;
  block 2, bits 10 - 19: U2+3&lt;br /&gt;
  block 2, bits 20 - 29: Y2&lt;br /&gt;
  block 3, bits  0 -  9: V2+3&lt;br /&gt;
  block 3, bits 10 - 19: Y3&lt;br /&gt;
  block 3, bits 20 - 29: U4+5&lt;br /&gt;
  block 4, bits  0 -  9: Y4&lt;br /&gt;
  block 4, bits 10 - 19: V4+5&lt;br /&gt;
  block 4, bits 20 - 29: Y5&lt;br /&gt;
&lt;br /&gt;
In addition, the start of each line is aligned to a multiple of 128 bytes; unused blocks are padded with 0.&lt;br /&gt;
&lt;br /&gt;
Unused parts of a partial (i.e. last) block do not seem to contain any specific value.&lt;br /&gt;
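The layout above can be sketched as a small decoder. This is a minimal illustration, not FFmpeg code; it unpacks one 16-byte group into the 12 ten-bit components in the table's order, and computes the 128-byte-aligned line size.

```python
def unpack_v210_group(data):
    """Decode one 16-byte v210 group (four little-endian 32-bit words)
    into the 12 ten-bit components in the order shown in the table:
    U0+1, Y0, V0+1, Y1, U2+3, Y2, V2+3, Y3, U4+5, Y4, V4+5, Y5."""
    comps = []
    for i in range(0, 16, 4):
        w = int.from_bytes(data[i:i + 4], "little")
        for shift in (0, 10, 20):
            comps.append((w >> shift) % 0x400)  # keep the low 10 bits
    return comps

def v210_stride(width):
    """Bytes per line: 6 pixels per 16-byte group, rounded up to 128."""
    groups = (width + 5) // 6
    return (groups * 16 + 127) // 128 * 128

# A group whose first word holds only a maximal first component:
sample = (0x3FF).to_bytes(4, "little") + bytes(12)
print(unpack_v210_group(sample))  # [1023, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
print(v210_stride(720))           # 1920, already a multiple of 128
```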
&lt;br /&gt;
FFmpeg does not support this format since it does not support any 10-bit-per-component pixel formats.&lt;br /&gt;
Patches that simply discard the two least significant bits are referenced [http://wiki.multimedia.cx/index.php?title=Interesting_Patches#v210_decoder_patches here].&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=V210&amp;diff=11127</id>
		<title>V210</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=V210&amp;diff=11127"/>
		<updated>2009-02-17T18:34:41Z</updated>

		<summary type="html">&lt;p&gt;Reimar: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The v210 format is a packed YUV 4:2:2 (UYVY) format with 10 bits per component.&lt;br /&gt;
&lt;br /&gt;
Data is stored in blocks of 32 bit values in little-endian.&lt;br /&gt;
&lt;br /&gt;
Each such block contains 3 components, one each in bits 0 - 9, 10 - 19 and 20 - 29; the remaining two bits are unused.&lt;br /&gt;
&lt;br /&gt;
Since UYVY consists of 4 components that represent two pixels, which of those bits correspond to which component makes a pattern that repeats every 4 32-bit blocks:&lt;br /&gt;
&lt;br /&gt;
  block 1, bits  0 -  9: U&lt;br /&gt;
  block 1, bits 10 - 19: Y&lt;br /&gt;
  block 1, bits 20 - 29: V&lt;br /&gt;
  block 2, bits  0 -  9: Y&lt;br /&gt;
  block 2, bits 10 - 19: U&lt;br /&gt;
  block 2, bits 20 - 29: Y&lt;br /&gt;
  block 3, bits  0 -  9: V&lt;br /&gt;
  block 3, bits 10 - 19: Y&lt;br /&gt;
  block 3, bits 20 - 29: U&lt;br /&gt;
  block 4, bits  0 -  9: Y&lt;br /&gt;
  block 4, bits 10 - 19: V&lt;br /&gt;
  block 4, bits 20 - 29: Y&lt;br /&gt;
&lt;br /&gt;
In addition, the start of each line is aligned to a multiple of 64 (or possibly 128) bytes; unused blocks are padded with 0.&lt;br /&gt;
&lt;br /&gt;
Unused parts of a partial (i.e. last) block do not seem to contain any specific value.&lt;br /&gt;
&lt;br /&gt;
FFmpeg does not support this format since it does not support any 10-bit-per-component pixel formats.&lt;br /&gt;
Patches that simply discard the two least significant bits are referenced [http://wiki.multimedia.cx/index.php?title=Interesting_Patches#v210_decoder_patches here].&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=Interesting_Patches&amp;diff=11126</id>
		<title>Interesting Patches</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=Interesting_Patches&amp;diff=11126"/>
		<updated>2009-02-17T18:33:30Z</updated>

		<summary type="html">&lt;p&gt;Reimar: /* v210 decoder patches */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page tries to collect some useful patches for FFmpeg that didn't make into SVN for some reason or another.&lt;br /&gt;
&lt;br /&gt;
== native [[Zlib]] decoder by Mans Rullgard ==&lt;br /&gt;
http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2007-July/032820.html&lt;br /&gt;
&lt;br /&gt;
In the same thread, there are patches to use the native decoder in several FFmpeg decoders.&lt;br /&gt;
&lt;br /&gt;
== [[WMV3]] encoder by Denis Fortin ==&lt;br /&gt;
http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2007-June/031699.html&lt;br /&gt;
&lt;br /&gt;
== [[H.263]] rtp patch ==&lt;br /&gt;
http://www.voxgratia.org/bin/ffmpeg-0.4.7.patch.zip, originally at http://www.salyens.com/downloads/index.html#ffmpeg-0.4.7, now removed.&lt;br /&gt;
&lt;br /&gt;
== [[Apple RPZA]] encoder by Todd Kirby ==&lt;br /&gt;
http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2005-June/001673.html&lt;br /&gt;
&lt;br /&gt;
== Test Pattern Generator Demuxer by Nicholas George ==&lt;br /&gt;
http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2007-October/036838.html&lt;br /&gt;
&lt;br /&gt;
== Test Pattern Generator Demuxer by [[User:Angustia|Ramiro Ribeiro Polla]] ==&lt;br /&gt;
http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2007-April/028226.html&lt;br /&gt;
Or &lt;br /&gt;
http://thread.gmane.org/gmane.comp.video.ffmpeg.devel/49447&lt;br /&gt;
&lt;br /&gt;
== PES packetizer by Xiaohui Sun ==&lt;br /&gt;
http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2007-September/034849.html&lt;br /&gt;
&lt;br /&gt;
Part of the work of [[FFmpeg Summer Of Code#TS Muxer|Summer Of Code TS Muxer]]&lt;br /&gt;
&lt;br /&gt;
== Imlib2script: a scriptable vhook by [[User:Wzrlpy|Víctor Paesa]] ==&lt;br /&gt;
http://thread.gmane.org/gmane.comp.video.ffmpeg.devel/52341&lt;br /&gt;
&lt;br /&gt;
== vf_imlib2: a libavfilter filter by [[User:Wzrlpy|Víctor Paesa]] ==&lt;br /&gt;
http://lists.mplayerhq.hu/pipermail/ffmpeg-soc/2007-December/002162.html&lt;br /&gt;
&lt;br /&gt;
== File concatenation by Wolfram Gloger ==&lt;br /&gt;
http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2007-July/032131.html&lt;br /&gt;
&lt;br /&gt;
== &amp;quot;mem&amp;quot; file protocol by Lagrange Multiplier ==&lt;br /&gt;
The &amp;quot;mem&amp;quot; protocol simply uses RAM as a source for input multimedia data, akin to how the &amp;quot;file&amp;quot; and &amp;quot;pipe&amp;quot; protocols use filesystem files and pipes as sources.&lt;br /&gt;
&lt;br /&gt;
http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2007-May/028489.html&lt;br /&gt;
&lt;br /&gt;
== Presets/profiles for usual targets by Panagiotis Issaris ==&lt;br /&gt;
Allows keeping groups of command options in a text file and applying them all at once by specifying the target name.&lt;br /&gt;
&lt;br /&gt;
Handy for iPod, PSP, or any other picky multimedia player that otherwise requires lengthy command lines.&lt;br /&gt;
&lt;br /&gt;
http://article.gmane.org/gmane.comp.video.ffmpeg.devel/37244&lt;br /&gt;
&lt;br /&gt;
== PiP (Picture in Picture): a vhook filter by Mihail Stoyanov ==&lt;br /&gt;
http://article.gmane.org/gmane.comp.video.ffmpeg.devel/38896&lt;br /&gt;
&lt;br /&gt;
== [[AMV]] encoder ==&lt;br /&gt;
http://code.google.com/p/amv-codec-tools/&lt;br /&gt;
&lt;br /&gt;
See this post [http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2007-October/037356.html] for what is still missing to get it into SVN.&lt;br /&gt;
&lt;br /&gt;
== [[Electronic Arts Formats]] demuxer/decoder by [[User:Suxen drol|Peter Ross]]==&lt;br /&gt;
http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2007-October/036938.html&lt;br /&gt;
The format demuxer modifications and the EA video codecs have not yet been applied to FFmpeg.&lt;br /&gt;
&lt;br /&gt;
== Experimental MSVC port by Ole André Vadla Ravnås ==&lt;br /&gt;
Code in the [http://bazaar-vcs.org bazaar] branch at http://people.collabora.co.uk/~oleavr/OABuild/bzr/ffmpeg/&lt;br /&gt;
&lt;br /&gt;
Patch at http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2008-March/044463.html&lt;br /&gt;
&lt;br /&gt;
== H264 encoder by Jori Liesenborgs &amp;amp; Panagiotis Issaris ==&lt;br /&gt;
http://research.edm.uhasselt.be/~h264/&lt;br /&gt;
&lt;br /&gt;
== DTS/AC3 in wav autodetection ==&lt;br /&gt;
http://thread.gmane.org/gmane.comp.video.ffmpeg.devel/49812/focus=49909&lt;br /&gt;
Clean up this patch and also add detection of AC3 in wav, it is similar. Samples for both can be found here: http://www.sr.se/cgi-bin/mall/artikel.asp?ProgramID=2445&amp;amp;Artikel=739973&lt;br /&gt;
&lt;br /&gt;
== [[Bink Audio]] decoder by [[User:Suxen drol|Peter Ross]] ==&lt;br /&gt;
http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2008-April/045346.html&lt;br /&gt;
&lt;br /&gt;
Note: An updated patch is under development by [[User:DrV]] based on an updated patch by the original [[User:Suxen drol|author]].&lt;br /&gt;
&lt;br /&gt;
== G722 decoder by Chas Williams ==&lt;br /&gt;
http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2008-June/048457.html&lt;br /&gt;
&lt;br /&gt;
It is basically an adaptation to FFmpeg of the [http://www.soft-switch.org/ SpanDSP] decoder.&lt;br /&gt;
&lt;br /&gt;
== [[Chinese AVS]] video encoder by [[User:StefanG|Stefan Gehrer]] ==&lt;br /&gt;
http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2007-July/033286.html&lt;br /&gt;
&lt;br /&gt;
==  Lossless msmpeg4v3 to mpeg4 transcoder ==&lt;br /&gt;
http://thread.gmane.org/gmane.comp.video.ffmpeg.devel/17074&lt;br /&gt;
&lt;br /&gt;
== Fixed point cook decoder ==&lt;br /&gt;
http://thread.gmane.org/gmane.comp.video.ffmpeg.devel/46024&lt;br /&gt;
http://thread.gmane.org/gmane.comp.video.ffmpeg.devel/54008&lt;br /&gt;
http://thread.gmane.org/gmane.comp.video.ffmpeg.devel/54553&lt;br /&gt;
[[Category:FFmpeg]]&lt;br /&gt;
&lt;br /&gt;
== GDI screen grabbing for Win32 ==&lt;br /&gt;
http://thread.gmane.org/gmane.comp.video.ffmpeg.devel/43589&lt;br /&gt;
&lt;br /&gt;
There are two implementations in the thread above.&lt;br /&gt;
&lt;br /&gt;
== [[RealAudio sipr|RealAudio SIPR]] @16k decoder and demuxer by [[User:Voroshil|Vladimir Voroshilov]] ==&lt;br /&gt;
&lt;br /&gt;
http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2008-September/052961.html&lt;br /&gt;
&lt;br /&gt;
Expected to work with FFmpeg r15192&lt;br /&gt;
&lt;br /&gt;
== [[QCELP]] reference decoder wrapper by Moriyoshi Koizumi ==&lt;br /&gt;
&lt;br /&gt;
http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2006-December/020223.html&lt;br /&gt;
&lt;br /&gt;
== Proper parsing of DTS-HD MA streams ==&lt;br /&gt;
http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2008-November/056526.html&lt;br /&gt;
&lt;br /&gt;
== [[ACELP.net]] and G.729 decoder ==&lt;br /&gt;
[http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2008-May/046518.html filters]&lt;br /&gt;
&lt;br /&gt;
[http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2008-May/046519.html pitch lag decoding]&lt;br /&gt;
&lt;br /&gt;
[http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2008-May/046520.html vectors operations]&lt;br /&gt;
&lt;br /&gt;
[http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2008-May/046521.html G.729 core]&lt;br /&gt;
&lt;br /&gt;
[http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2008-May/046522.html G.729 tables]&lt;br /&gt;
&lt;br /&gt;
[http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2008-May/046523.html G.729 postfilter]&lt;br /&gt;
&lt;br /&gt;
[http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2008-May/046524.html G.729D decoder]&lt;br /&gt;
&lt;br /&gt;
== v210 decoder patches ==&lt;br /&gt;
[http://thread.gmane.org/gmane.comp.video.ffmpeg.devel/54195 http://thread.gmane.org/gmane.comp.video.ffmpeg.devel/54195]&lt;br /&gt;
&lt;br /&gt;
[http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2009-February/062881.html http://lists.mplayerhq.hu/pipermail/ffmpeg-devel/2009-February/062881.html]&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=V210&amp;diff=11125</id>
		<title>V210</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=V210&amp;diff=11125"/>
		<updated>2009-02-17T18:28:13Z</updated>

		<summary type="html">&lt;p&gt;Reimar: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The v210 format is a packed YUV 4:2:2 (UYVY) format with 10 bits per component.&lt;br /&gt;
&lt;br /&gt;
Data is stored in blocks of 32 bit values in little-endian.&lt;br /&gt;
&lt;br /&gt;
Each such block contains 3 components, one each in bits 0 - 9, 10 - 19 and 20 - 29; the remaining two bits are unused.&lt;br /&gt;
&lt;br /&gt;
Since UYVY consists of 4 components that represent two pixels, which of those bits correspond to which component makes a pattern that repeats every 4 32-bit blocks:&lt;br /&gt;
&lt;br /&gt;
  block 1, bits  0 -  9: U&lt;br /&gt;
  block 1, bits 10 - 19: Y&lt;br /&gt;
  block 1, bits 20 - 29: V&lt;br /&gt;
  block 2, bits  0 -  9: Y&lt;br /&gt;
  block 2, bits 10 - 19: U&lt;br /&gt;
  block 2, bits 20 - 29: Y&lt;br /&gt;
  block 3, bits  0 -  9: V&lt;br /&gt;
  block 3, bits 10 - 19: Y&lt;br /&gt;
  block 3, bits 20 - 29: U&lt;br /&gt;
  block 4, bits  0 -  9: Y&lt;br /&gt;
  block 4, bits 10 - 19: V&lt;br /&gt;
  block 4, bits 20 - 29: Y&lt;br /&gt;
&lt;br /&gt;
In addition, the start of each line is aligned to a multiple of 64 (or possibly 128) bytes; unused blocks are padded with 0.&lt;br /&gt;
&lt;br /&gt;
Unused parts of a partial (i.e. last) block do not seem to contain any specific value.&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
	<entry>
		<id>https://wiki.multimedia.cx/index.php?title=V210&amp;diff=11124</id>
		<title>V210</title>
		<link rel="alternate" type="text/html" href="https://wiki.multimedia.cx/index.php?title=V210&amp;diff=11124"/>
		<updated>2009-02-17T18:26:34Z</updated>

		<summary type="html">&lt;p&gt;Reimar: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The v210 format is a packed YUV 4:2:2 (UYVY) format with 10 bits per component.&lt;br /&gt;
Data is stored in blocks of 32 bit values in little-endian.&lt;br /&gt;
Each such block contains 3 components, one each in bits 0 - 9, 10 - 19 and 20 - 29; the remaining two bits are unused.&lt;br /&gt;
Since UYVY consists of 4 components that represent two pixels, which of those bits correspond to which component makes a pattern that repeats every 4 32-bit blocks:&lt;br /&gt;
block 1, bits  0 -  9: U&lt;br /&gt;
block 1, bits 10 - 19: Y&lt;br /&gt;
block 1, bits 20 - 29: V&lt;br /&gt;
block 2, bits  0 -  9: Y&lt;br /&gt;
block 2, bits 10 - 19: U&lt;br /&gt;
block 2, bits 20 - 29: Y&lt;br /&gt;
block 3, bits  0 -  9: V&lt;br /&gt;
block 3, bits 10 - 19: Y&lt;br /&gt;
block 3, bits 20 - 29: U&lt;br /&gt;
block 4, bits  0 -  9: Y&lt;br /&gt;
block 4, bits 10 - 19: V&lt;br /&gt;
block 4, bits 20 - 29: Y&lt;br /&gt;
&lt;br /&gt;
In addition, the start of each line is aligned to a multiple of 64 (or possibly 128) bytes; unused blocks are padded with 0.&lt;br /&gt;
Unused parts of a partial (i.e. last) block do not seem to contain any specific value.&lt;/div&gt;</summary>
		<author><name>Reimar</name></author>
	</entry>
</feed>