Linshizhi / ffmpeg.wasm-core · Commits
Commit dc7ad85c
Authored Jan 02, 2012 by Clément Bœsch
Committed by Clément Bœsch, Jan 04, 2012
doc: use @command{} for commands.
parent 83712656
Showing 7 changed files with 30 additions and 30 deletions
doc/encoders.texi     +1 -1
doc/ffprobe.texi      +1 -1
doc/filters.texi      +1 -1
doc/indevs.texi       +12 -12
doc/libavfilter.texi  +7 -7
doc/outdevs.texi      +1 -1
doc/protocols.texi    +7 -7
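For context: in Texinfo, @file{} marks file names while @command{} marks the names of programs, so the changes below move names such as ffmpeg, ffplay and ffprobe from the former to the latter. A minimal illustrative fragment (not part of the patch) showing the intended distinction:

    @c Illustrative only; the actual changes are in the diffs below.
    For example to read @file{split1.mpeg} with @command{ffplay}:
    @example
    ffplay split1.mpeg
    @end example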
doc/encoders.texi
@@ -577,7 +577,7 @@ Allow to set any x264 option, see x264 --fullhelp for a list.
 ":".
 @end table
-For example to specify libx264 encoding options with @file{ffmpeg}:
+For example to specify libx264 encoding options with @command{ffmpeg}:
 @example
 ffmpeg -i foo.mpg -vcodec libx264 -x264opts keyint=123:min-keyint=20 -an out.mkv
 @end example
doc/ffprobe.texi
@@ -135,7 +135,7 @@ Read @var{input_file}.
 @chapter Writers
 @c man begin WRITERS
-A writer defines the output format adopted by @file{ffprobe}, and will be
+A writer defines the output format adopted by @command{ffprobe}, and will be
 used for printing all the parts of the output.
 A writer may accept one or more arguments, which specify the options to
doc/filters.texi
@@ -354,7 +354,7 @@ A customized down-mix to stereo that works automatically for 3-, 4-, 5- and
 pan=stereo: FL < FL + 0.5*FC + 0.6*BL + 0.6*SL : FR < FR + 0.5*FC + 0.6*BR + 0.6*SR
 @end example
-Note that @file{ffmpeg} integrates a default down-mix (and up-mix) system
+Note that @command{ffmpeg} integrates a default down-mix (and up-mix) system
 that should be preferred (see "-ac" option) unless you have very specific
 needs.
doc/indevs.texi
@@ -196,12 +196,12 @@ device.
 Once you have created one or more JACK readable clients, you need to
 connect them to one or more JACK writable clients.
-To connect or disconnect JACK clients you can use the
-@file{jack_connect} and @file{jack_disconnect} programs, or do it
-through a graphical interface, for example with @file{qjackctl}.
+To connect or disconnect JACK clients you can use the @command{jack_connect}
+and @command{jack_disconnect} programs, or do it through a graphical interface,
+for example with @command{qjackctl}.
 To list the JACK clients and their properties you can invoke the command
-@file{jack_lsp}.
+@command{jack_lsp}.
 Follows an example which shows how to capture a JACK readable client
 with @command{ffmpeg}.
@@ -260,7 +260,7 @@ device.
 @itemize
 @item
-Create a color video stream and play it back with @file{ffplay}:
+Create a color video stream and play it back with @command{ffplay}:
 @example
 ffplay -f lavfi -graph "color=pink [out0]" dummy
 @end example
@@ -280,14 +280,14 @@ ffplay -f lavfi -graph "testsrc [out0]; testsrc,hflip [out1]; testsrc,negate [ou
 @item
 Read an audio stream from a file using the amovie source and play it
-back with @file{ffplay}:
+back with @command{ffplay}:
 @example
 ffplay -f lavfi "amovie=test.wav"
 @end example
 @item
 Read an audio stream and a video stream and play it back with
-@file{ffplay}:
+@command{ffplay}:
 @example
 ffplay -f lavfi "movie=test.avi[out0];amovie=test.wav[out1]"
 @end example
@@ -380,7 +380,7 @@ $ ffmpeg -f openal -i '' out.ogg
 @end example
 Capture from two devices simultaneously, writing to two different files,
-within the same @file{ffmpeg} command:
+within the same @command{ffmpeg} command:
 @example
 $ ffmpeg -f openal -i 'DR-BT101 via PulseAudio' out1.ogg -f openal -i 'ALSA Default' out2.ogg
 @end example
@@ -415,7 +415,7 @@ The filename to provide to the input device is a source device or the
 string "default"
 To list the pulse source devices and their properties you can invoke
-the command @file{pactl list sources}.
+the command @command{pactl list sources}.
 @example
 ffmpeg -f pulse -i default /tmp/pulse.wav
@@ -516,8 +516,8 @@ the device.
 Video4Linux and Video4Linux2 devices only support a limited set of
 @var{width}x@var{height} sizes and frame rates. You can check which are
-supported for example with the command @file{dov4l} for Video4Linux
-devices and the command @file{v4l-info} for Video4Linux2 devices.
+supported for example with the command @command{dov4l} for Video4Linux
+devices and the command @command{v4l-info} for Video4Linux2 devices.
 If the size for the device is set to 0x0, the input device will
 try to auto-detect the size to use.
@@ -579,7 +579,7 @@ default to 0.
 Check the X11 documentation (e.g. man X) for more detailed information.
-Use the @file{dpyinfo} program for getting basic information about the
+Use the @command{dpyinfo} program for getting basic information about the
 properties of your X11 display (e.g. grep for "name" or "dimensions").
 For example to grab from @file{:0.0} using @command{ffmpeg}:
doc/libavfilter.texi
@@ -43,13 +43,13 @@ The result will be that in output the top half of the video is mirrored
 onto the bottom half.
 Video filters are loaded using the @var{-vf} option passed to
-ffmpeg or to ffplay. Filters in the same linear chain are separated by
-commas. In our example, @var{split, fifo, overlay} are in one linear
-chain, and @var{fifo, crop, vflip} are in another. The points where
-the linear chains join are labeled by names enclosed in square
-brackets. In our example, that is @var{[T1]} and @var{[T2]}. The magic
-labels @var{[in]} and @var{[out]} are the points where video is input
-and output.
+@command{ffmpeg} or to @command{ffplay}. Filters in the same linear
+chain are separated by commas. In our example, @var{split, fifo,
+overlay} are in one linear chain, and @var{fifo, crop, vflip} are in
+another. The points where the linear chains join are labeled by names
+enclosed in square brackets. In our example, that is @var{[T1]} and
+@var{[T2]}. The magic labels @var{[in]} and @var{[out]} are the points
+where video is input and output.
 Some filters take in input a list of parameters: they are specified
 after the filter name and an equal sign, and are separated each other
doc/outdevs.texi
@@ -60,7 +60,7 @@ If not specified it defaults to the size of the input video.
 @subsection Examples
-The following command shows the @file{ffmpeg} output is an
+The following command shows the @command{ffmpeg} output is an
 SDL window, forcing its size to the qcif format:
 @example
 ffmpeg -i INPUT -vcodec rawvideo -pix_fmt yuv420p -window_size qcif -f sdl "SDL output"
doc/protocols.texi
@@ -52,7 +52,7 @@ resource to be concatenated, each one possibly specifying a distinct
 protocol.
 For example to read a sequence of files @file{split1.mpeg},
-@file{split2.mpeg}, @file{split3.mpeg} with @file{ffplay} use the
+@file{split2.mpeg}, @file{split3.mpeg} with @command{ffplay} use the
 command:
 @example
 ffplay concat:split1.mpeg\|split2.mpeg\|split3.mpeg
@@ -183,7 +183,7 @@ application specified in @var{app}, may be prefixed by "mp4:".
 @end table
-For example to read with @file{ffplay} a multimedia resource named
+For example to read with @command{ffplay} a multimedia resource named
 "sample" from the application "vod" from an RTMP server "myserver":
 @example
 ffplay rtmp://myserver/vod/sample
@@ -224,7 +224,7 @@ For example, to stream a file in real-time to an RTMP server using
 ffmpeg -re -i myfile -f flv rtmp://myserver/live/mystream
 @end example
-To play the same stream using @file{ffplay}:
+To play the same stream using @command{ffplay}:
 @example
 ffplay "rtmp://myserver/live/mystream live=1"
 @end example
@@ -249,7 +249,7 @@ The required syntax for a RTSP url is:
 rtsp://@var{hostname}[:@var{port}]/@var{path}
 @end example
-The following options (set on the @command{ffmpeg}/@file{ffplay} command
+The following options (set on the @command{ffmpeg}/@command{ffplay} command
 line, or set in code via @code{AVOption}s or in @code{avformat_open_input}),
 are supported:
@@ -288,7 +288,7 @@ When receiving data over UDP, the demuxer tries to reorder received packets
 order for this to be enabled, a maximum delay must be specified in the
 @code{max_delay} field of AVFormatContext.
-When watching multi-bitrate Real-RTSP streams with @file{ffplay}, the
+When watching multi-bitrate Real-RTSP streams with @command{ffplay}, the
 streams to display can be chosen with @code{-vst} @var{n} and
 @code{-ast} @var{n} for video and audio respectively, and can be switched
 on the fly by pressing @code{v} and @code{a}.
@@ -365,13 +365,13 @@ To broadcast a stream on the local subnet, for watching in VLC:
 ffmpeg -re -i @var{input} -f sap sap://224.0.0.255?same_port=1
 @end example
-Similarly, for watching in ffplay:
+Similarly, for watching in @command{ffplay}:
 @example
 ffmpeg -re -i @var{input} -f sap sap://224.0.0.255
 @end example
-And for watching in ffplay, over IPv6:
+And for watching in @command{ffplay}, over IPv6:
 @example
 ffmpeg -re -i @var{input} -f sap sap://[ff0e::1:2:3:4]