Fractals/Computer graphic techniques/workflow

Image processing: the pipeline and workflow of 2D graphics

Graphics pipeline in Wikipedia

Color Theory: Color_gradient

=Stages=

Image processing stages:
 * colour calibration (for all devices taking an active part in a colour-managed workflow)
 * preprocessing
 * processing (capturing the image = taking the photo, creating the digital image)
 * postprocessing = editing the captured image = modification of the image: graphic algorithms
 * postprocessing of raw files

Film ( motion picture) production consists of five major stages:
 * Development: Ideas for the film are created, rights to existing intellectual properties are purchased, etc., and the screenplay is written. Financing for the project is sought and obtained.
 * Pre-production: Arrangements and preparations are made for the shoot, such as hiring cast and film crew, selecting locations and constructing sets.
 * Production: The raw footage and other elements of the film are recorded during the film shoot, including principal photography.
 * Post-production: The images, sound, and visual effects of the recorded film are edited and combined into a finished product.
 * Distribution: The completed film is distributed, marketed, and screened in cinemas and/or released to home video to be viewed.

=color spaces=

For correct results, different color spaces are needed for:
 * rendering
 * display and printing
 * storage of images

Rendering and compositing are best done in a scene-linear color space, which corresponds more closely to nature and makes computations more physically accurate (Blender documentation).

=2D pipeline=
 * choose the algorithm
 * choose the image size in pixels (iWidth x iHeight = iX_Size x iY_Size)
 * choose the plane viewport (a rectangle defined by 4 corner points, or by center, radius and aspect ratio)
 * choose the plane transformation
 * create an empty array
 * fill the array
 * save the array as a raw file
 * postprocessing (Polish: postprodukcja)
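The steps above can be sketched in Python. This is a minimal illustration, not a real renderer: the `represent` function and the viewport values are made-up stand-ins for an actual fractal algorithm and plane transformation.

```python
import array

iWidth, iHeight = 8, 6                       # chosen image size in pixels
cx, cy, radius = 0.0, 0.0, 2.0               # plane viewport: center + radius

def represent(zx, zy):
    """Toy representation function: squared distance from the origin,
    scaled and clamped to an 8-bit gray level."""
    return min(255, int((zx * zx + zy * zy) * 32))

data = array.array('B', [0] * (iWidth * iHeight))   # create an empty 1D array
for iy in range(iHeight):
    for ix in range(iWidth):
        # pixel -> world (plane) coordinate
        zx = cx + radius * (2 * ix / (iWidth - 1) - 1)
        zy = cy + radius * (2 * iy / (iHeight - 1) - 1)
        data[iy * iWidth + ix] = represent(zx, zy)   # fill the array

with open("image.raw", "wb") as f:                   # save as a headerless raw file
    f.write(data.tobytes())
```

Postprocessing would then read `image.raw` back and colorize it, as described in the RAW section below.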

Another example; for each pixel of an image:
 * take the integer coordinate in the 1D memory array
 * compute the integer coordinate in the virtual 2D array
 * compute the world coordinate of the pixel
 * compute the value of the representation function for this pixel
 * compute the color of the pixel
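A hedged sketch of this per-pixel chain (1D index, 2D index, world coordinate, value, color); the representation function here is a toy placeholder, not a fractal iteration:

```python
iWidth, iHeight = 4, 3
xmin, xmax, ymin, ymax = -2.0, 2.0, -1.5, 1.5        # plane viewport

def pixel_color(i):
    ix, iy = i % iWidth, i // iWidth                 # integer 2D coordinate
    zx = xmin + (xmax - xmin) * ix / (iWidth - 1)    # world coordinate
    zy = ymin + (ymax - ymin) * iy / (iHeight - 1)
    value = zx * zx + zy * zy                        # toy representation function
    g = max(0, min(255, int(value * 40)))            # map value to a gray level
    return (g, g, g)                                 # color of the pixel

colors = [pixel_color(i) for i in range(iWidth * iHeight)]
```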

Data structures
 * 1D memory array
 * virtual 2D array

Files
 * raw files with data
 * raster image files with colors (png, pgm, ...)
 * mixed files containing both kinds of information (like EXR)

"use pro editing techniques by working non-destructively. This means, don’t edit on the original image layer. Instead, duplicate the layer for every type of edit you plan to do. This way, you can repeat the editing steps with a duplicate file at 16 bits." Chris Parker

Steps
 * clipping: convert an object in world coordinates to an object subset = applying the world window to the object
 * window-to-viewport mapping
 * rasterisation (scan conversion): convert high-level object descriptions to pixel colors in the frame buffer
 * display / save
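The window-to-viewport step above is a linear mapping; a minimal sketch (the window and viewport tuples are example values):

```python
def window_to_viewport(x, y, window, viewport):
    """Map a point from world-window coordinates to device (viewport)
    coordinates with independent linear scales in x and y."""
    wxmin, wymin, wxmax, wymax = window
    vxmin, vymin, vxmax, vymax = viewport
    sx = (vxmax - vxmin) / (wxmax - wxmin)
    sy = (vymax - vymin) / (wymax - wymin)
    return (vxmin + (x - wxmin) * sx, vymin + (y - wymin) * sy)

# map the world window [-2,2]x[-2,2] onto a 400x400 pixel viewport
p = window_to_viewport(0.0, 0.0, (-2, -2, 2, 2), (0, 0, 400, 400))
# p == (200.0, 200.0): the window center lands in the viewport center
```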

RAW
The raw graphic file
 * contains raw pixel data (numbers) in binary format
 * programs can save the raw iteration data for later colouring or other analysis.
 * because the file contains no image formatting information, before reading/opening the file one needs information about:
 ** image dimensions
 ** pixel data format
 ** pixel scan order
 ** byte arrangement (little- or big-endian format)
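Reading such a headerless file means supplying all four pieces of information yourself. A sketch with example choices (4x2 pixels, 16-bit unsigned, big-endian, row-major scan order), using a fabricated in-memory payload in place of a real file:

```python
import struct

width, height = 4, 2                          # image dimensions (must be known)
payload = struct.pack(">8H", *range(8))       # fake raw data: big-endian uint16

# ">...H" encodes both the pixel data format (uint16) and the byte order (big-endian)
pixels = struct.unpack(">%dH" % (width * height), payload)

# apply the scan order: row-major, top to bottom
rows = [pixels[r * width:(r + 1) * width] for r in range(height)]
```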

stages of raw file processing

Actions:
 * reading: load the data from the disc into a memory array
 * colorize = convert the raw data of a pixel to a pixel color
 * exposure value adjustment
 * white balance adjustment
 * hue and tone adjustment
 * highlight and shadow recovery
 * vibrance and saturation adjustment
 * cropping and rotation
 * noise reduction
 * sharpening
 * some actions are better done after conversion to jpg files:
 ** red-eye removal
 ** local touch-up or cloning to erase unwanted objects in the frame
 ** adding a frame
 ** mixing with other output files, like changing the background
 * saving

Convert and colorize; for each pixel do:

pgm[i] = raw[i] * m

where m is a scaling factor that maps the raw data range onto the gray-level range.
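With m chosen so that the largest raw value maps to the maximum gray level (255 for 8-bit PGM), the conversion can be sketched as (the raw values here are invented example data):

```python
raw = [0, 10, 50, 100, 200]          # example raw data, e.g. iteration counts
m = 255 / max(raw)                   # scaling factor
pgm = [round(r * m) for r in raw]    # pgm[i] = raw[i] * m
```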

Examples of RAW image processing

Example by Kyung-Hoe Huh
 * Raw image before processing
 * Binary image
 * Outline image
 * Skeletonized image after preprocessing
 * Composition image of raw and outline images
 * Composition image of raw and skeletonized image

Image Processing Over Time of A Raw Fractal by ImageJ

Tips
 * "the process of applying full-screen filters and effects to a camera’s image buffer before it is displayed to screen. It can drastically improve the visuals of your product with little setup time."

full HDR pipeline

 * HDR image
 * HLG or PQ transfer function
 * HDR monitor

HDR Content Production Workflow by EIZO:
 * checking the data from a shoot
 * VFX = Visual effects
 * compositing work
 * final color grading.

HDR to SDR pipeline
From HDR to tone mapped SDR
 * HDR image
 * Reinhard & Drago tone mapping
 * SDR display = SDR image tone mapped from HDR
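The global Reinhard operator mentioned above compresses scene luminance L into [0, 1) via L / (1 + L); a minimal sketch (the input luminances are arbitrary example values, and real tone mappers add key/white-point parameters):

```python
def reinhard(luminance):
    """Global Reinhard tone mapping: maps [0, inf) into [0, 1)."""
    return luminance / (1.0 + luminance)

hdr = [0.0, 0.5, 1.0, 4.0, 100.0]        # scene-referred luminances
sdr = [reinhard(v) for v in hdr]          # every value now fits below 1.0
```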

full SDR pipeline

 * SDR image
 * gamma transfer function ( gamma correction)
 * SDR monitor (sRGB)

color workflow
Rendering intent: color operations should be done with the use case in mind,
 * either to model human perception
 * or to model the physical behavior of light (Physically Based Rendering, PBR), which requires a linear version of RGB (not sRGB)

Color space:
 * images are stored on disk and passed to displays in sRGB, which is approximately perceptually uniform in intensity
 * shader math is done in linear RGB, which is physically uniform in intensity
 * printing is done in CMYK

See also rendering styles:
 * Unbiased rendering = Photorealistic rendering
 * Non-photorealistic rendering
 * Artistic rendering
 * hyperrealistic rendering = Rendering styles that combine photorealism with non-photorealism

Examples

 * Krita: color-managed workflow
 * Blender: official color management documentation and wikibooks
 * EFI: Fiery color imaging workflow
 * Skia: color management
 * Adobe

The correct procedure when working with graphics is to convert all gamma-encoded media to linear intensities, perform any calculation/blending/averaging, and convert back to gamma encoding.
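This procedure matters even for something as simple as averaging two pixels. A sketch using the common 2.2-power approximation of the sRGB transfer function (the exact sRGB curve is piecewise and slightly different):

```python
def to_linear(c):
    """Approximate decode: gamma-encoded [0,1] -> linear intensity."""
    return c ** 2.2

def to_gamma(c):
    """Approximate encode: linear intensity -> gamma-encoded [0,1]."""
    return c ** (1.0 / 2.2)

a, b = 0.0, 1.0                                      # black and white, gamma-encoded
naive = (a + b) / 2                                  # 0.5: averaged in gamma space, too dark
correct = to_gamma((to_linear(a) + to_linear(b)) / 2)  # about 0.73: physically correct mean
```

Averaging the encoded values directly underestimates the true mid-intensity, which is why blends done in gamma space look too dark.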


linear workflow
The linear workflow uses a linear color space for rendering:
 * "In the real world, the colors we perceive are all in linear color space. In order for the monitor to mimic the real world, it needs to also display in linear fashion. Due to the limitations of the monitor hardware and image optimization requirements, images are processed and saved with Gamma correction."
 * Due to limitations of old CRT monitor technology, images could only be displayed via gamma 2.2 color correction. Even though modern monitors can directly display linear images, gamma 2.2 has become something of a standard convention.
 * There is no need to linearize floating-point image formats such as .hdr and .exr (32 bit), which are already in gamma space 1.0.
 * When a camera snaps an image, an 8-bit image is saved for monitor compatibility and file optimization via sRGB gamma 0.45 processing.

sRGB Image Gamma 0.45 + Monitor Gamma 2.2 = Final Display Gamma 1
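The equation above in code: the camera's encoding gamma (1/2.2 ≈ 0.45) and the monitor's display gamma (2.2) cancel, so the round trip is the identity (0.18 below is just an example mid-gray value):

```python
encode_gamma, display_gamma = 1 / 2.2, 2.2

def camera_encode(v):
    return v ** encode_gamma       # sRGB-style gamma 0.45 encoding

def monitor_display(v):
    return v ** display_gamma      # monitor gamma 2.2 decoding

v = 0.18                                    # mid-gray scene intensity
shown = monitor_display(camera_encode(v))   # net display gamma 1: identity
```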

Workflow in Unity3D
Unity3D
 * Linear color space workflow: linear color space rendering gives more precise results
 * Gamma color space workflow: gamma color space is the historically standard format = don't use it

Color buffers by Poimandres
(an open-source developer collective):
 * UnsignedByteType sRGB frame buffers (= 8 bits per color channel) to store intermediate results. This is a trade-off between hardware support, efficiency and quality. With low-precision sRGB buffers, colors will be clamped to [0.0, 1.0] and information loss will shift to the darker spectrum, which leads to noticeable banding in dark scenes.
 * Linear results normally require at least 12 bits per color channel to prevent color degradation and banding. Linear, high-precision HalfFloatType buffers don't have these issues and are the preferred option for HDR-like workflows on desktop devices.
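The precision trade-off can be illustrated by quantizing the darkest part of a linear gradient: 8-bit storage leaves only a handful of distinct dark levels (visible as banding), while 16-bit storage keeps every sample distinct. This is a toy demonstration, not the actual buffer code:

```python
dark = [i / 1000 for i in range(100)]           # darkest 10% of a linear ramp

levels_8  = {round(v * 255)   for v in dark}    # distinct codes at 8 bits/channel
levels_16 = {round(v * 65535) for v in dark}    # distinct codes at 16 bits/channel
# 100 input shades collapse to ~26 codes at 8 bits, but stay distinct at 16 bits
```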

Color Pipeline in CUVI
CUVI (a GPU-accelerated library for Imaging and Computer Vision applications)

Let's take a typical color pipeline and measure its performance on an entry-level GPU. Any color pipeline almost always starts with the raw image. Before converting to RGB, you might want to do some processing on the raw data, which may include applying LUTs (look-up tables), FPN (fixed pattern noise) removal and fixing the white balance. Next comes demosaic/debayer, followed by several further enhancement functions and a color space conversion into the desired format. This pipeline can perform in real time on a decent entry-level GPU on 8K images, and at over 100 FPS on a 2K image.
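The demosaic/debayer stage converts the single-channel Bayer mosaic into RGB. A toy nearest-neighbour sketch for an RGGB mosaic (real pipelines such as CUVI's use far better interpolation; this only shows the raw-to-RGB data flow, and the input values are invented):

```python
def debayer_rggb(mosaic):
    """Nearest-neighbour demosaic: every pixel takes its colors from
    the enclosing 2x2 RGGB cell (R at top-left, G on the anti-diagonal,
    B at bottom-right). Expects even dimensions."""
    h, w = len(mosaic), len(mosaic[0])
    rgb = [[None] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            y0, x0 = y - y % 2, x - x % 2            # top-left of the 2x2 cell
            r = mosaic[y0][x0]
            g = (mosaic[y0][x0 + 1] + mosaic[y0 + 1][x0]) / 2
            b = mosaic[y0 + 1][x0 + 1]
            rgb[y][x] = (r, g, b)
    return rgb

out = debayer_rggb([[10, 20],
                    [30, 40]])       # a single RGGB cell
```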

Desktop Publishing Workflow

 * GIMP provides top-notch color management features to ensure high-fidelity color reproduction across digital and printed media.
 * It is best used in workflows involving other free software such as Scribus, Inkscape, and SwatchBooker.

Krita color workflow
Krita has two systems dedicated to color management:
 * lcms2, which deals with ICC profiles, keeping colors consistent over many interpretations by devices (screens, printers) through a reference space; used for connection with programs like GIMP 2.9+, Inkscape, digiKam and Scribus
 * OCIO, which deals with LUT color management, manipulating the interpretation of said colors; used for connection with programs like Blender and Natron

skia
Skia: The 2D Graphics Library

All the color spaces Skia works with describe themselves by how to transform colors from that color space to a common “connection” color space called XYZ D50.

And we can infer from that same description how to transform from that XYZ D50 space back to the original color space.

XYZ D50 is a color space represented in three dimensions like RGB, but the XYZ parts are not RGB-like at all, rather a linear remix of those channels. Y is closest to what you’d think of as brightness, but X and Z are a little more abstract. It’s kind of like YUV if you’re familiar with that. The “D50” part refers to the whitepoint of this space, around 5000 Kelvin.

All color managed drawing is divided into six parts:
 * three steps connecting the source colors to that XYZ D50 space
 * then three symmetric steps connecting back from XYZ D50 to the destination color space.

Some of these steps can annihilate with each other into no-ops, sometimes all the way to the entire process amounting to a no-op when the source space and destination space are the same.

Color management steps
 * unpremultiply if the source color is premultiplied – alpha is not involved in color management, and we need to divide it out if it's multiplied in
 * linearize the source color using the source color space's transfer function
 * convert those unpremultiplied, linear source colors to XYZ D50 gamut by multiplying by a 3x3 matrix
 * convert those XYZ D50 colors to the destination gamut by multiplying by a 3x3 matrix
 * encode that color using the inverse of the destination color space’s transfer function
 * premultiply by alpha if the destination is premultiplied
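A hedged Python sketch of these six steps for a single RGBA pixel. The 2.2-power transfer function and the identity 3x3 matrices are stand-ins, not Skia's actual profile data; with real profiles the matrices come from the source and destination color spaces, and when source equals destination the whole chain is a no-op, as the text notes:

```python
def mat_mul(m, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

# identity stand-ins; real pipelines use the profiles' gamut matrices
SRC_TO_XYZ = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
XYZ_TO_DST = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]

def manage(r, g, b, a, gamma=2.2):
    """Color-manage one premultiplied RGBA pixel (a must be non-zero)."""
    rgb = [r / a, g / a, b / a]                # 1. unpremultiply
    rgb = [c ** gamma for c in rgb]            # 2. linearize (source transfer fn)
    rgb = mat_mul(SRC_TO_XYZ, rgb)             # 3. source gamut -> XYZ D50
    rgb = mat_mul(XYZ_TO_DST, rgb)             # 4. XYZ D50 -> destination gamut
    rgb = [c ** (1 / gamma) for c in rgb]      # 5. encode (inverse dest transfer fn)
    return [c * a for c in rgb] + [a]          # 6. premultiply by alpha
```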

Requirements for a color-managed workflow

 * Accurate device profiles obtained with source or output characterization software.
 * Correctly loaded video card LUTs (or monitor profiles that do not require LUT adjustments).
 * Color-managed applications that are configured to use a correct monitor profile and input/output profiles, with support for control over the rendering intent and black point compensation.

Calibration and profiling require:
 * for input devices (scanner, camera, etc.), a color target which the profiling software will compare to the manufacturer-provided color values of the target;
 * for output devices (monitor, printer, etc.), a reading with a specific device (spectrophotometer, colorimeter or spectrocolorimeter) of the color patch values, comparing the measured values against the values originally sent for output.

Monitor calibration and profiling
One of the critical elements in any color-managed workflow is the monitor: most images require color adaptation through imaging software at one step or another, so the ability of the monitor to present accurate colors is crucial.

Monitor color management consists of calibration and profiling. The first step, calibration, is done by adjusting the monitor controls and the output of the graphics card (via calibration curves) to match user-definable characteristics, such as brightness, white point and gamma. The second step, profiling (characterization), involves measuring the calibrated display's response and recording it in a color profile, which is stored in an ICC file. For convenience, the calibration settings are usually stored together with the profile in the ICC file.

Note that .icc files are identical to .icm files - the difference is only in the name.

Seeing correct colors requires using a monitor profile-aware application, together with the same calibration used when profiling the monitor. Calibration alone does not yield accurate colors. If a monitor was calibrated before it was profiled, the profile will only yield correct colors when used on the monitor with the same calibration (the same monitor control adjustments and the same calibration curves loaded into the video card's lookup table). macOS has built-in support for loading calibration curves and installing a system-wide color profile. Windows 7 onward allows loading calibration curves, though this functionality must be enabled manually. Linux and older versions of Windows require using a standalone LUT loader.

Device profiles
ICC profiles are cross-platform and can thus be created on other operating systems and used under Linux. Monitor profiles, however, require some additional attention. Since a monitor profile depends both on the monitor itself and on the video card, a monitor profile should only be used with the same monitor and video card with which it was created. The monitor settings should not be adjusted after creating the profile. In addition, since most calibration software use LUT adjustments during calibration, the corresponding LUTs must be loaded every time the display server (X11, Wayland) is started (e.g. with every graphical login).

In the unlikely case of a colorimeter being unsupported by Linux, a profile created under Windows or macOS can be used under Linux.

Display-channel lookup tables
There are two approaches to loading display channel LUTs:

 * Create a profile that does not modify the video card LUTs and thus does not require LUTs to be loaded later on. Ideally, this approach would rely on Display Data Channel (DDC)-capable monitors, whose internal settings are set via calibration software. Unfortunately, monitors capable of making these adjustments through DDC are not common and are generally expensive, and there is only one calibration software on Linux that can interact with a DDC monitor. For mainstream monitors, a couple of options exist:
 ** BasICColor software, which works with most colorimeters on the market, allows one to adjust the display output via the monitor interface and then to choose a "Profile, do not calibrate" option. By doing this, one can create a profile that does not require video card LUT adjustments.
 ** For EyeOne devices, EyeOne Match allows the user to calibrate to "Native" gamma and white point targets, which results in the LUT adjustment curves displayed after the calibration being a simple, linear 1:1 mapping (a straight line from corner to corner).
 ** Neither BasICColor nor EyeOne Match presently runs under Linux, but both are capable of creating a profile that does not require LUT adjustments.
 * Use an LUT loader to actually load the LUT adjustments contained within the profile prepared during calibration. According to the documentation, these loaders do not modify the video card LUT itself, but achieve the same type of adjustment by modifying the X server gamma ramp. Loaders are available for Linux distributions that use X.org or XFree86, the two most popular X servers on Linux; other X servers are not guaranteed to work with the currently available loaders. There are two LUT loaders available for Linux:
 ** xcalib is one such loader, and although it is a command-line utility, it is quite easy to use.
 ** dispwin is a part of Argyll CMS.

If, for any reason, the LUT cannot be loaded, it is still recommended to go through the initial stages of calibration, where the user is asked by the calibration software to make some manual adjustments to the monitor, as this will often improve display linearity and also provide information on its color temperature. This is especially recommended for CRT monitors.

Color-managed applications
In ICC-aware applications, it is important to make sure the correct profiles are assigned to devices, mainly to the monitor and the printer. Some Linux applications can auto-detect the monitor profile, while others require it to be specified manually.

Although there is no designated place to store device profiles on Linux,  has become the de facto standard.

Most applications running under WINE have not been fully tested for color accuracy. While 8-bit-per-pixel (bpp) programs can have some color resolution difficulties due to depth conversion errors, colors in higher-depth applications should be accurate, as long as those programs perform their gamut conversions based on the same monitor profile as the one used for loading the LUT, and provided that the corresponding LUT adjustments are loaded.

Hardware acceleration

 * OpenGL
 * Vulkan
 * SYCL
 * OpenCL
 * CUDA
 * HIP = Heterogeneous-compute Interface for Portability
 * OpenMP
 * OpenACC

=See also=
 * 3D pipeline
 * Category:Graphics_pipeline in commons

=References=
 * BenQ: How photographers incorporate color management in their workflow
 * Basics of Image Processing — Vincent Mazet (Université de Strasbourg), 2020-2023, CC BY-NC 4.0
 * Image editor effects by Alain Galvan
 * CCD astrophotography processing by Pedro Ré
 * colormanagement.org