MeGUI/FAQ

What does MeGUI do? What are its basic use cases?
MeGUI is a Windows front-end for command-line audio and video encoding tools (x264, XviD, various audio encoders and muxers) built around AviSynth scripts; typical use cases are transcoding DVDs or other sources to MP4/MKV files. For more background, see the archived MeGUI wiki:
 * web.archive.org/web/20140214061901/http://mewiki.project357.com/wiki/Main_Page

Where can I get the latest version of MeGUI?
SourceForge hosts a copy of MeGUI; even if it is not the latest version, you can use its built-in update mechanism to fetch the latest builds.

How can I use the development version of MeGUI?
You can switch to the bleeding-edge development builds under Options -> Settings -> Extra Config -> Configure Servers.

Which version do you recommend - stable or development?
The development builds are the latest builds, straight from the SVN. The trade-off is that they may have more bugs. To get possibly more stable builds, with fewer features, select the stable update server instead (see the previous question).

How do I report a bug or request a feature?
MeGUI uses the SourceForge trackers for both. You can view and post bug reports and feature requests without an account.

What version of the .NET Framework can I use?
Versions 2.0 and 3.0 should work.

The path to save ___ is invalid
This error usually occurs if AviSynth was not installed before MeGUI. The solution is to install AviSynth; the latest stable version is preferred.

What are workers?
A worker in MeGUI is a 'thread' that can process jobs in the queue. One worker can process one job at a time. There are a few reasons you might want to run more than one job at once:
 * You are running one job and it is not using all the CPU (i.e., it is I/O limited, or otherwise bottlenecked)
 * You want to run one job straight away (i.e., run a small encode without waiting for a long queue)
You can read more about workers here: MeGUI Parallel Job Execution.

How do I use delay?
If the file has a delay value in its filename (DELAY __ms), or you otherwise know you need a delay, you have to compensate for it in one of two places in MeGUI (use only one of them). The first is to enter the delay value as-is into the muxer config when you mux your final file. The alternative is to apply the delay when you re-encode the audio in the audio config, again entering the delay as-is.
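As an illustration, the delay marker that demuxing tools write into filenames (e.g. "DELAY -200ms") can be extracted automatically. This is a minimal Python sketch, not part of MeGUI; the function name and sample filenames are invented for the example.

```python
import re

def parse_delay_ms(filename):
    """Extract the delay, in milliseconds, from a demuxed audio
    filename such as 'movie T80 DELAY -200ms.ac3'.
    Returns 0 if no delay marker is present."""
    match = re.search(r"DELAY (-?\d+)\s*ms", filename, re.IGNORECASE)
    return int(match.group(1)) if match else 0

print(parse_delay_ms("movie T80 2_0ch DELAY -200ms.ac3"))  # -200
print(parse_delay_ms("plain_audio.ac3"))                   # 0
```

The extracted value, sign included, is exactly what you would type into the muxer or audio config.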

How do I import audio?
If simply adding the audio file does not work, you can create an AviSynth script to load it for you. Create a text file containing the line DirectShowSource("path\to\file.mp3", audio=true) and save it with a .avs extension. You can then open this file in MeGUI and encode the audio to whatever format you want.
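A small helper to generate such one-line wrapper scripts could look like this; a hedged Python sketch (the function name is invented for the example), writing exactly the DirectShowSource line described above.

```python
def build_audio_avs(audio_path):
    # One-line AviSynth script that loads only the audio stream.
    return 'DirectShowSource("{}", audio=true)'.format(audio_path)

# Write the wrapper script, then open the .avs file in MeGUI.
script = build_audio_avs(r"C:\media\file.mp3")
with open("load_audio.avs", "w") as f:
    f.write(script + "\n")
print(script)
```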

How do I set the aspect ratio of my video?
It's on the video preview window after opening a video in the main window. See Aspect ratio signaling in AviSynth scripts for a more in-depth treatment.

My source VOBs are split up! How can I tell MeGUI's d2v indexer to import them properly?
If the VOBs are named in the format VTS_xx_yy.vob, where xx is the same for every VOB but yy differs (i.e. VTS_02_01, VTS_02_02, etc.), and they are in the same directory, they will be linked automatically.
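The linking rule can be sketched in Python (the helper name is invented for this illustration): group files that share the same xx title-set number, skipping the yy == 0 menu VOB as DVD indexers typically do.

```python
import re
from collections import defaultdict

VOB_RE = re.compile(r"VTS_(\d{2})_(\d+)\.vob$", re.IGNORECASE)

def group_vobs(filenames):
    """Group VOB filenames by title set (the xx in VTS_xx_yy.vob);
    VTS_xx_0.vob is the menu VOB and is left out."""
    groups = defaultdict(list)
    for name in filenames:
        m = VOB_RE.search(name)
        if m and int(m.group(2)) != 0:
            groups[m.group(1)].append(name)
    return {ts: sorted(names) for ts, names in groups.items()}

files = ["VTS_02_01.VOB", "VTS_02_02.VOB", "VTS_01_01.VOB", "VTS_02_0.VOB"]
print(group_vobs(files))
# {'02': ['VTS_02_01.VOB', 'VTS_02_02.VOB'], '01': ['VTS_01_01.VOB']}
```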

My logs say my video is not divisible by 16, should I be worried?
All video encoders in MeGUI work best when the video resolution is evenly divisible by 16 (aka mod16). If it is not, the encoder will pad it up to the next multiple of 16. In other words, if your video can't be mod16 for some reason, keep its dimensions as close to multiples of 16 as possible so the encoder has as little padding to do as possible.

The quality loss from non mod16 encoding is lower at higher resolutions, and is generally small.
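The padding behaviour is easy to sketch (these helpers are illustrative Python, not MeGUI code): round each dimension up to the next multiple of 16 and look at what fraction of the encoded frame is padding.

```python
def pad_to_mod16(width, height):
    """Round a resolution up to the next multiple of 16,
    as the encoder does internally for non-mod16 input."""
    up = lambda n: (n + 15) // 16 * 16
    return up(width), up(height)

def padding_overhead(width, height):
    """Fraction of the encoded frame that is padding."""
    pw, ph = pad_to_mod16(width, height)
    return 1 - (width * height) / (pw * ph)

print(pad_to_mod16(718, 478))                  # (720, 480)
print(round(padding_overhead(706, 466), 4))    # 0.048  at SD
print(round(padding_overhead(1906, 1058), 4))  # 0.0203 at HD
```

Both sample resolutions fall 14 pixels short of mod16 in each dimension, yet the relative overhead is less than half at the higher resolution, matching the note above.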

Encoding with xvid doesn't work!
Check the log. If there is a line like Error opening input file extra\_____.cqm, you do not have the required quantization matrix at the correct path. Download the .cqm file and place it in the default location, program files\megui\tools\xvidencraw\extra. Alternatively, you can manually select the path to the .cqm in the XviD codec config. You can find the .cqms in the attachment in this post.

How does the number of threads affect quality?
x264 implements threading with two models. The first is Avisynth input threading, and the second is parallel frames encoding.
 * Threaded Avisynth input: If the input is an .avs script, this setting tells x264 to decode the script in its own thread. This is especially useful for CPU-intensive scripts, but gives a slight speed advantage even for the fastest scripts. In MeGUI this option is always on, as it can reduce neither encoding speed nor quality.
 * Parallel frame encoding: This method was introduced in recent x264 revisions and is similar to the Xvid 1.2 implementation of multithreaded encoding. It is more efficient, in both speed and final quality, than the slice-based implementation used previously. The option is in MeGUI's x264 codec config; the command-line switch is --threads n.

What's the difference between Constant Quantizer and Constant Quality?
These two modes are variations on the generic idea of "unknown bitrate, known quality" where the encoder aims to encode to a specified quality level. This is opposed to the normal "known bitrate, unknown quality" model where the encoder is given an average bitrate and must produce the best file possible with that. The advantage of the former is obviously that the quality can be precisely set, while the latter allows precise filesize control. Which one is right for you is your decision. Note that a 1pass constant quality/quantizer encode will not look as good as a 2pass encode to the same filesize.

In x264, there are two modes of "known quality", Constant Quantizer (CQ) and Constant Quality (aka CRF, Constant Ratefactor).
 * Constant quantizer: every frame is encoded with a mathematically identical quantizer. To the encoder, the resulting file has perfectly constant quality, although that measure does not necessarily match human perception (other video encoders 'interpret' constant quantizer in a similar fashion).
 * Constant quality (aka constant rate factor): the video is encoded to a nominal quantizer, but the encoder varies the actual quantizer from frame to frame to give higher perceived quality to human eyes. The output will usually be around the same size as a CQ encode (your mileage may vary), but it will look subjectively better, which is why CRF is generally the more commonly used of the two modes.