FFmpeg
ac3enc_template.c
static void scale_coefficients(AC3EncodeContext *s)
Definition: ac3enc_fixed.c / ac3enc_float.c (precision-specific implementations)
static CoefType calc_cpl_coord(CoefSumType energy_ch, CoefSumType energy_cpl)
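The coupling coordinate for a channel band scales the shared coupling channel back up to that channel's original level, so it is derived from the ratio of the channel's band energy to the coupling channel's band energy. A minimal float sketch of that idea (the epsilon guard is an assumption, and the template's actual calc_cpl_coord may arrange the square root differently):

    #include <math.h>

    /* Sketch: coupling coordinate from band energies.
     * energy_ch  - sum of squared coefficients of the coupled channel in this band
     * energy_cpl - sum of squared coefficients of the coupling channel in this band */
    static double cpl_coord_sketch(double energy_ch, double energy_cpl)
    {
        const double eps = 1e-24;   /* assumed guard against an empty coupling channel */
        if (energy_cpl < eps)
            return 0.0;
        return sqrt(energy_ch / energy_cpl);
    }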
int AC3_NAME(allocate_sample_buffers)(AC3EncodeContext *s)
Definition: ac3enc_template.c:51
void ff_ac3_process_exponents(AC3EncodeContext *s)
Calculate final exponents from the supplied MDCT coefficients and exponent shift. ...
Definition: ac3enc.c:635
int ff_ac3_validate_metadata(AC3EncodeContext *s)
Validate metadata options as set by AVOption system.
Definition: ac3enc.c:1831
static void apply_channel_coupling(AC3EncodeContext *s)
Definition: ac3enc_template.c:128
void (*extract_exponents)(uint8_t *exp, int32_t *coef, int nb_coefs)
Definition: ac3dsp.h:127
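An AC-3 exponent counts how far a 24-bit fixed-point coefficient sits below full scale: magnitudes in [2^23, 2^24) get exponent 0, each halving adds one, and zero maps to 24. A plain-C sketch of that mapping, not the optimized ac3dsp implementations (the clamping of over-range inputs is an assumption):

    #include <stdint.h>

    /* Sketch: exponent extraction for 24-bit fixed-point MDCT coefficients. */
    static void extract_exponents_sketch(uint8_t *exp, const int32_t *coef, int nb_coefs)
    {
        for (int i = 0; i < nb_coefs; i++) {
            int32_t v = coef[i] < 0 ? -coef[i] : coef[i];
            if (!v) {
                exp[i] = 24;                  /* zero coefficient: largest exponent */
            } else {
                int log2v = 0;                /* floor(log2(v)) */
                while (v >> (log2v + 1))
                    log2v++;
                int e = 23 - log2v;           /* 0 for magnitudes in [2^23, 2^24) */
                exp[i] = (uint8_t)(e < 0 ? 0 : e);
            }
        }
    }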
uint8_t new_cpl_coords[AC3_MAX_CHANNELS]
send new coupling coordinates (cplcoe)
Definition: ac3enc.h:147
uint8_t cpl_master_exp[AC3_MAX_CHANNELS]
coupling coord master exponents (mstrcplco)
Definition: ac3enc.h:148
void (*mdct_calcw)(struct FFTContext *s, FFTDouble *output, const FFTSample *input)
Definition: fft.h:84
static int normalize_samples(AC3EncodeContext *s)
common internal API header
int ff_ac3_compute_bit_allocation(AC3EncodeContext *s)
Definition: ac3enc.c:1144
void ff_ac3_adjust_frame_size(AC3EncodeContext *s)
Adjust the frame size to make the average bit rate match the target bit rate.
Definition: ac3enc.c:181
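When the target bit rate is not an integer number of bits per frame (notably at 44.1 kHz), the encoder alternates between two legal frame sizes so the long-run average hits the target. A self-contained sketch of that accumulator idea; the variable names and the 192 kbps / 44.1 kHz figures are illustrative, not taken from the encoder's state:

    #include <stdint.h>
    #include <stdio.h>

    /* Sketch: choose the next frame size (bytes) so the running average
     * bit rate tracks the target.  AC-3 frame sizes step in 16-bit words,
     * hence the +2 bytes. */
    static int next_frame_size(int frame_size_min, int bit_rate, int sample_rate,
                               int samples_per_frame, uint64_t *bits_written,
                               uint64_t *samples_written)
    {
        int frame_size = frame_size_min;
        /* use the larger size whenever the stream is currently under target */
        if (*bits_written * (uint64_t)sample_rate < *samples_written * (uint64_t)bit_rate)
            frame_size += 2;
        *bits_written    += frame_size * 8;
        *samples_written += samples_per_frame;
        return frame_size;
    }

    int main(void)
    {
        uint64_t bits = 0, samples = 0;
        for (int i = 0; i < 8; i++)   /* 192 kbps at 44.1 kHz: sizes alternate 834/836 */
            printf("frame %d: %d bytes\n", i,
                   next_frame_size(834, 192000, 44100, 1536, &bits, &samples));
        return 0;
    }

The real routine also keeps the two running totals bounded so they cannot overflow; that bookkeeping is omitted here.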
int ff_alloc_packet2(AVCodecContext *avctx, AVPacket *avpkt, int size)
Check AVPacket size and/or allocate data.
Definition: libavcodec/utils.c:1377
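Inside an encoder's encode callback, this helper is typically called first to obtain (or size-check) the output buffer before any bitstream is written. A usage sketch; it only compiles inside the libavcodec tree, and the surrounding function is hypothetical:

    /* Sketch of typical ff_alloc_packet2() usage in an encode callback. */
    #include "avcodec.h"
    #include "internal.h"    /* internal header that declares ff_alloc_packet2() */

    static int encode_frame_sketch(AVCodecContext *avctx, AVPacket *avpkt, int frame_size)
    {
        int ret = ff_alloc_packet2(avctx, avpkt, frame_size);
        if (ret < 0)
            return ret;                 /* allocation failed or user buffer too small */
        /* ... write frame_size bytes of coded data into avpkt->data ... */
        return 0;
    }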
int AC3_NAME(encode_frame)(AVCodecContext *avctx, AVPacket *avpkt, const AVFrame *frame, int *got_packet_ptr)
Definition: ac3enc_template.c:388
static void apply_window(void *dsp, SampleType *output, const SampleType *input, const SampleType *window, unsigned int len)
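Each MDCT input block is multiplied element-wise by the analysis window before the transform; the template only dispatches this to a DSP routine, but the operation itself is a plain vector multiply. A float sketch of what that call computes:

    /* Sketch: element-wise windowing of one MDCT input block. */
    static void apply_window_sketch(float *output, const float *input,
                                    const float *window, unsigned int len)
    {
        for (unsigned int i = 0; i < len; i++)
            output[i] = input[i] * window[i];
    }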
void ff_ac3_output_frame(AC3EncodeContext *s, unsigned char *frame)
Write the frame to the output bitstream.
Definition: ac3enc.c:1659
void ff_ac3_quantize_mantissas(AC3EncodeContext *s)
Quantize mantissas using coefficients, exponents, and bit allocation pointers.
Definition: ac3enc.c:1298
uint8_t cpl_band_sizes[AC3_MAX_CPL_BANDS]
number of coeffs in each coupling band
Definition: ac3enc.h:212
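Walking this table with a running start bin yields each coupling band's [start, end) coefficient range. A small sketch of that iteration; the start bin of 37 (the lowest coupled bin in AC-3) and the example band sizes are assumptions for illustration:

    #include <stdint.h>
    #include <stdio.h>

    /* Sketch: derive per-band coefficient ranges from the band-size table. */
    static void print_cpl_bands(const uint8_t *band_sizes, int num_bands, int cpl_start_bin)
    {
        int start = cpl_start_bin;
        for (int bnd = 0; bnd < num_bands; bnd++) {
            int end = start + band_sizes[bnd];
            printf("band %d: bins [%d, %d)\n", bnd, start, end);
            start = end;
        }
    }

    int main(void)
    {
        const uint8_t sizes[3] = { 12, 12, 24 };   /* illustrative: 12-bin structure bands, last two merged */
        print_cpl_bands(sizes, 3, 37);
        return 0;
    }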
static void sum_square_butterfly(AC3EncodeContext *s, CoefSumType sum[4], const CoefType *coef0, const CoefType *coef1, int len)
static void compute_rematrixing_strategy(AC3EncodeContext *s)
Definition: ac3enc_template.c:336
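The rematrixing decision compares, per band, the energy of the plain left/right coefficients with the energy of their sum and difference: if matrixing concentrates the energy into one of the two signals, coding sum/difference is cheaper and the band is flagged. A float sketch of the butterfly sums and a comparison of that form (the encoder's exact threshold may differ):

    #include <stdbool.h>

    /* Sketch: band energies of L, R, L+R and L-R (the "butterfly" sums). */
    static void sum_square_butterfly_sketch(double sum[4], const float *l,
                                            const float *r, int len)
    {
        sum[0] = sum[1] = sum[2] = sum[3] = 0.0;
        for (int i = 0; i < len; i++) {
            double lt = l[i], rt = r[i];
            sum[0] += lt * lt;                 /* left            */
            sum[1] += rt * rt;                 /* right           */
            sum[2] += (lt + rt) * (lt + rt);   /* sum signal      */
            sum[3] += (lt - rt) * (lt - rt);   /* difference      */
        }
    }

    /* Sketch of the decision: rematrix when the cheaper of sum/difference
     * beats the cheaper of left/right. */
    static bool use_rematrixing_sketch(const double sum[4])
    {
        double lr_min = sum[0] < sum[1] ? sum[0] : sum[1];
        double sd_min = sum[2] < sum[3] ? sum[2] : sum[3];
        return sd_min < lr_min;
    }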
static void copy_input_samples(AC3EncodeContext *s, SampleType **samples)
Definition: ac3enc_template.c:75
void ff_ac3_apply_rematrixing(AC3EncodeContext *s)
Apply stereo rematrixing to coefficients based on rematrixing flags.
Definition: ac3enc.c:270
const uint8_t ff_ac3_rematrix_band_tab[5]
Table of bin locations for rematrixing bands. Reference: Section 7.5.2 Rematrixing: Frequency Band Definitions.
Definition: ac3tab.c:139
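When a band is flagged, the left and right coefficients in it are replaced by their half-sum and half-difference; the decoder undoes this with an add and a subtract, so nothing is lost beyond rounding. A float sketch of applying one flagged band (the routine in ac3enc.c works on the shared fixed-point coefficients and would shift rather than multiply):

    /* Sketch: in-place sum/difference rematrixing of one flagged band.
     * coef_l / coef_r point at the band's first left/right coefficient. */
    static void apply_rematrixing_sketch(float *coef_l, float *coef_r, int len)
    {
        for (int i = 0; i < len; i++) {
            float lt = coef_l[i];
            float rt = coef_r[i];
            coef_l[i] = (lt + rt) * 0.5f;   /* "left" now carries the sum        */
            coef_r[i] = (lt - rt) * 0.5f;   /* "right" now carries the difference */
        }
    }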
static av_always_inline int64_t ff_samples_to_time_base(AVCodecContext *avctx, int64_t samples)
Rescale from sample rate to AVCodecContext.time_base.
Definition: libavcodec/internal.h:186
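Packet timestamps and durations are expressed in the codec time base, so sample counts are rescaled by the ratio of the sample rate to that time base (in FFmpeg this kind of rescale is normally done with av_rescale_q, which also handles rounding and overflow). A plain-integer sketch of the arithmetic:

    #include <stdint.h>
    #include <stdio.h>

    /* Sketch: samples / sample_rate seconds expressed in a tb_num/tb_den time base. */
    static int64_t samples_to_time_base_sketch(int64_t samples, int sample_rate,
                                               int tb_num, int tb_den)
    {
        return samples * tb_den / ((int64_t)sample_rate * tb_num);
    }

    int main(void)
    {
        /* 1536 samples at 48 kHz in a 1/90000 time base -> 2880 ticks */
        printf("%lld\n", (long long)samples_to_time_base_sketch(1536, 48000, 1, 90000));
        return 0;
    }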
Definition: ac3.h:77
void ff_ac3_compute_coupling_strategy(AC3EncodeContext *s)
Set the initial coupling strategy parameters prior to coupling analysis.
Definition: ac3enc.c:199
void (*float_to_fixed24)(int32_t *dst, const float *src, unsigned int len)
Convert an array of float in range [-1.0,1.0] to int32_t with range [-(1<<24),(1<<24)].
Definition: ac3dsp.h:89
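This is the bridge from the float MDCT output to the shared fixed-point exponent and bit-allocation code: scale by 2^24 and round to the nearest integer. A plain-C sketch of what the optimized DSP versions compute (inputs are assumed to already be clipped to [-1.0, 1.0]):

    #include <math.h>
    #include <stdint.h>

    /* Sketch: convert floats in [-1.0, 1.0] to 24-bit fixed point. */
    static void float_to_fixed24_sketch(int32_t *dst, const float *src, unsigned int len)
    {
        for (unsigned int i = 0; i < len; i++)
            dst[i] = (int32_t)lrintf(src[i] * 16777216.0f);   /* 16777216 = 1 << 24 */
    }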
static void clip_coefficients(DSPContext *dsp, CoefType *coef, unsigned int len)
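Clipping keeps coefficients inside the range the later fixed-point stages can represent, in the float path a clamp to just under ±1.0 ahead of the 24-bit conversion above. A minimal sketch; the exact bound is an assumption standing in for the encoder's actual limit:

    /* Sketch: clamp coefficients so the 24-bit conversion cannot overflow.
     * The bound (16777215/16777216, just below 1.0) is an assumed stand-in
     * for the encoder's actual limit. */
    static void clip_coefficients_sketch(float *coef, unsigned int len)
    {
        const float coef_max = 16777215.0f / 16777216.0f;
        for (unsigned int i = 0; i < len; i++) {
            if (coef[i] >  coef_max) coef[i] =  coef_max;
            if (coef[i] < -coef_max) coef[i] = -coef_max;
        }
    }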