Discussion:
[Spice-devel] [spice v12 00/26] Add GStreamer support for video streaming
Francois Gouget
2016-04-05 15:14:56 UTC
This patch series adds support for using GStreamer to encode and
decode the video streams, which brings support for the VP8 and h264
codecs.

As before, the patches can also be grabbed from the gst branch of the
repositories below:

spice: https://github.com/fgouget/spice
spice-gtk: https://github.com/fgouget/spice-gtk
xf86-video-qxl: https://github.com/fgouget/xf86-video-qxl
spice-protocol: https://github.com/fgouget/spice-protocol

See also the gst-sync branch for the old spice-gtk code. (There are
also 'extras' branches with more experimental/future patches for the
curious.)


Besides rebases, the changes from v11 are limited to spice-gtk.
Specifically, they fix two compatibility issues:
* v11 used a GAsyncQueue and relied on g_async_queue_push_front(), which
is only available in GLib >= 2.46. That GLib version is unfortunately
not available in Debian 8, and presumably not on many other Linux
distributions either.
* It also associated the original network message with the
corresponding GStreamer buffer using the GstMeta API. This is necessary
in order to know where to display the decoded frame, as this can change
from one frame to the next. However, some GStreamer elements have
(refcounting) bugs that cause the metadata to be lost in the pipeline.
I had a workaround in place that worked on Debian Testing but not on
Debian 8.
https://bugzilla.gnome.org/show_bug.cgi?id=757254

So in the end I reworked both aspects: the metadata now goes into a
regular GQueue and the PTS timestamp is used to match entries in that
queue to GStreamer's buffers.
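
For reference, the reworked approach is roughly the following (a minimal
sketch with illustrative helper names, not the actual spice-gtk code; it
assumes SpiceMsgIn is the incoming message type carrying the display
information and that frames come out of the pipeline in PTS order):

typedef struct SpiceFrameMeta {
    GstClockTime pts;   /* timestamp of the buffer pushed into appsrc */
    SpiceMsgIn *msg;    /* original message, tells us where to display the frame */
} SpiceFrameMeta;

/* Remember the metadata when pushing a frame into the pipeline
 * (the real code also keeps a reference on the message).
 */
static void frame_meta_queue(GQueue *pending, GstClockTime pts, SpiceMsgIn *msg)
{
    SpiceFrameMeta *meta = g_new0(SpiceFrameMeta, 1);
    meta->pts = pts;
    meta->msg = msg;
    g_queue_push_tail(pending, meta);
}

/* Find the message matching a decoded buffer. Entries for frames that
 * got lost inside the pipeline are discarded along the way.
 */
static SpiceMsgIn *frame_meta_match(GQueue *pending, GstBuffer *buffer)
{
    SpiceFrameMeta *meta;
    while ((meta = g_queue_pop_head(pending)) != NULL) {
        GstClockTime pts = meta->pts;
        SpiceMsgIn *msg = meta->msg;
        g_free(meta);
        if (pts == GST_BUFFER_PTS(buffer)) {
            return msg;
        }
        /* Otherwise that frame never made it out of the pipeline */
    }
    return NULL;
}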


Patches and changes from v10:
server: Check the client video codec capabilities
Fixed the case where the server and client cannot agree on the
video codec to use.
server: Let the administrator pick the video encoder and codec
The video_codecs GArray is now considered immutable, which avoids
copying it around and avoids having to make the
RED_WORKER_MESSAGE_SET_VIDEO_CODECS RPC synchronous.
server: Use the optimal number of threads for VP8 encoding
Using the optimal number of cores for the VP8 encoder is now in a
separate patch (though skipping it will cause conflicts with
patches 14 and 16).
server: Avoid copying the input frame in the GStreamer encoder
Changed the zero-copy approach to not require ref/unref() of
RedDrawable objects to be thread-safe. Thread-safety aspects are
instead handled in gstreamer-encoder.c now.
server: Adjust the frame rate based on the GStreamer encoding time
This patch limits the frame rate (i.e. drops frames) when the
server has trouble keeping up with the encoding. This code seems
to only be needed in SpiceDeferredFPS mode (so maybe it's in the
SpiceDeferredFPS code that something should be modified). In
any case this patch can be skipped without any impact on the rest
of the series.
server: Respect the GStreamer encoder's valid bit rate range
set_gstenc_bitrate() should now pass the right integer type to
g_object_set() (see the sketch after this change list).
server: Add support for GStreamer 0.10
Reduced the number of #ifdefs for GStreamer 0.10 using
Christophe's suggestions.
spice-gtk: Add a GStreamer video decoder for MJPEG, VP8 and h264
- Tweaked some function names in the client's GStreamer decoder to
avoid potential name conflicts.
- Fixed a race condition in the client which would cause it to
freeze when hovering madly over the seek bar in YouTube videos.
- As a side effect the new code now schedules the frame rendering
itself so it can adapt when the mm-time gets yanked around.
- It also does not queue the frames in the pipeline and thus is less
likely to lose them when size changes force us to rebuild it.
spice-gtk: Avoid GAsyncQueue for compatibility with GLib < 2.46
This patch is optional; it avoids the dependency on
g_async_queue_push_front(), which is missing in GLib < 2.46
(e.g. on Debian 8).
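
About the 'Respect the GStreamer encoder's valid bit rate range' entry
above: g_object_set() is variadic, so the value passed for the "bitrate"
property must match the property's exact integer type, which differs
between encoder elements. A minimal sketch of the idea (the helper name
and the set of handled types are illustrative, not the exact patch code):

static void set_encoder_bitrate(GstElement *gstenc, uint64_t bit_rate)
{
    /* Look up the property to find out which integer type it expects */
    GObjectClass *klass = G_OBJECT_GET_CLASS(gstenc);
    GParamSpec *pspec = g_object_class_find_property(klass, "bitrate");
    if (pspec == NULL) {
        return; /* this encoder has no bitrate property */
    }
    switch (G_PARAM_SPEC_VALUE_TYPE(pspec)) {
    case G_TYPE_INT:
        g_object_set(G_OBJECT(gstenc), "bitrate", (gint)bit_rate, NULL);
        break;
    case G_TYPE_UINT:
        g_object_set(G_OBJECT(gstenc), "bitrate", (guint)bit_rate, NULL);
        break;
    case G_TYPE_ULONG:
        g_object_set(G_OBJECT(gstenc), "bitrate", (gulong)bit_rate, NULL);
        break;
    case G_TYPE_UINT64:
        g_object_set(G_OBJECT(gstenc), "bitrate", (guint64)bit_rate, NULL);
        break;
    default:
        break;
    }
}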
--
2.8.0.rc3
Francois Gouget
2016-04-05 15:15:03 UTC
This replaces the original mjpeg_encoder API with a VideoEncoder base
class which can be reimplemented by other encoders.
This also renames the members and enums from mjpeg_* to video_*.
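
For illustration, this is roughly how another encoder plugs into the new
interface (a minimal sketch using a hypothetical FooEncoder, not code
from this patch): the VideoEncoder base is embedded as the first member
so casting between the two pointer types is valid, and the constructor
fills in the method pointers.

typedef struct FooEncoder {
    VideoEncoder base;          /* must be the first member */
    uint64_t starting_bit_rate; /* encoder-specific state... */
} FooEncoder;

static void foo_encoder_destroy(VideoEncoder *video_encoder)
{
    free((FooEncoder*)video_encoder);
}

static uint64_t foo_encoder_get_bit_rate(VideoEncoder *video_encoder)
{
    return ((FooEncoder*)video_encoder)->starting_bit_rate;
}

VideoEncoder *foo_encoder_new(uint64_t starting_bit_rate,
                              VideoEncoderRateControlCbs *cbs)
{
    FooEncoder *encoder = spice_new0(FooEncoder, 1);
    encoder->base.destroy = foo_encoder_destroy;
    encoder->base.get_bit_rate = foo_encoder_get_bit_rate;
    /* encode_frame, client_stream_report, notify_server_frame_drop and
     * get_stats are set the same way */
    encoder->starting_bit_rate = starting_bit_rate;
    return (VideoEncoder*)encoder;
}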

Signed-off-by: Francois Gouget <***@codeweavers.com>
---
server/Makefile.am | 2 +-
server/dcc-send.c | 25 ++++----
server/dcc.c | 24 ++++----
server/dcc.h | 2 +-
server/mjpeg-encoder.c | 85 +++++++++++++++-----------
server/mjpeg-encoder.h | 102 -------------------------------
server/stream.c | 42 ++++++-------
server/stream.h | 4 +-
server/video-encoder.h | 160 +++++++++++++++++++++++++++++++++++++++++++++++++
9 files changed, 261 insertions(+), 185 deletions(-)
delete mode 100644 server/mjpeg-encoder.h
create mode 100644 server/video-encoder.h

diff --git a/server/Makefile.am b/server/Makefile.am
index a7a8d9f..8763f54 100644
--- a/server/Makefile.am
+++ b/server/Makefile.am
@@ -87,7 +87,6 @@ libserver_la_SOURCES = \
main-channel.c \
main-channel.h \
mjpeg-encoder.c \
- mjpeg-encoder.h \
red-channel.c \
red-channel.h \
red-common.h \
@@ -123,6 +122,7 @@ libserver_la_SOURCES = \
sound.h \
stat.h \
spicevmc.c \
+ video-encoder.h \
zlib-encoder.c \
zlib-encoder.h \
image-cache.h \
diff --git a/server/dcc-send.c b/server/dcc-send.c
index eb866cf..2fd4129 100644
--- a/server/dcc-send.c
+++ b/server/dcc-send.c
@@ -1691,7 +1691,7 @@ static int red_marshall_stream_data(RedChannelClient *rcc,
uint64_t time_now = spice_get_monotonic_time_ns();
size_t outbuf_size;

- if (!dcc->use_mjpeg_encoder_rate_control) {
+ if (!dcc->use_video_encoder_rate_control) {
if (time_now - agent->last_send_time < (1000 * 1000 * 1000) / agent->fps) {
agent->frames--;
#ifdef STREAM_STATS
@@ -1706,25 +1706,26 @@ static int red_marshall_stream_data(RedChannelClient *rcc,
drawable->red_drawable->mm_time :
reds_get_mm_time();
outbuf_size = dcc->send_data.stream_outbuf_size;
- ret = mjpeg_encoder_encode_frame(agent->mjpeg_encoder,
- &image->u.bitmap, width, height,
- &drawable->red_drawable->u.copy.src_area,
- stream->top_down, frame_mm_time,
- &dcc->send_data.stream_outbuf,
- &outbuf_size, &n);
+ ret = agent->video_encoder->encode_frame(agent->video_encoder,
+ frame_mm_time,
+ &image->u.bitmap, width, height,
+ &drawable->red_drawable->u.copy.src_area,
+ stream->top_down,
+ &dcc->send_data.stream_outbuf,
+ &outbuf_size, &n);
switch (ret) {
- case MJPEG_ENCODER_FRAME_DROP:
- spice_assert(dcc->use_mjpeg_encoder_rate_control);
+ case VIDEO_ENCODER_FRAME_DROP:
+ spice_assert(dcc->use_video_encoder_rate_control);
#ifdef STREAM_STATS
agent->stats.num_drops_fps++;
#endif
return TRUE;
- case MJPEG_ENCODER_FRAME_UNSUPPORTED:
+ case VIDEO_ENCODER_FRAME_UNSUPPORTED:
return FALSE;
- case MJPEG_ENCODER_FRAME_ENCODE_DONE:
+ case VIDEO_ENCODER_FRAME_ENCODE_DONE:
break;
default:
- spice_error("bad return value (%d) from mjpeg_encoder_encode_frame", ret);
+ spice_error("bad return value (%d) from VideoEncoder::encode_frame", ret);
return FALSE;
}
dcc->send_data.stream_outbuf_size = outbuf_size;
diff --git a/server/dcc.c b/server/dcc.c
index c952042..99b2540 100644
--- a/server/dcc.c
+++ b/server/dcc.c
@@ -346,7 +346,7 @@ static void dcc_init_stream_agents(DisplayChannelClient *dcc)
pipe_item_init(&agent->create_item, PIPE_ITEM_TYPE_STREAM_CREATE);
pipe_item_init(&agent->destroy_item, PIPE_ITEM_TYPE_STREAM_DESTROY);
}
- dcc->use_mjpeg_encoder_rate_control =
+ dcc->use_video_encoder_rate_control =
red_channel_client_test_remote_cap(RED_CHANNEL_CLIENT(dcc), SPICE_DISPLAY_CAP_STREAM_REPORT);
}

@@ -477,9 +477,9 @@ static void dcc_destroy_stream_agents(DisplayChannelClient *dcc)
StreamAgent *agent = &dcc->stream_agents[i];
region_destroy(&agent->vis_region);
region_destroy(&agent->clip);
- if (agent->mjpeg_encoder) {
- mjpeg_encoder_destroy(agent->mjpeg_encoder);
- agent->mjpeg_encoder = NULL;
+ if (agent->video_encoder) {
+ agent->video_encoder->destroy(agent->video_encoder);
+ agent->video_encoder = NULL;
}
}
}
@@ -1384,19 +1384,19 @@ static int dcc_handle_stream_report(DisplayChannelClient *dcc,
}

agent = &dcc->stream_agents[report->stream_id];
- if (!agent->mjpeg_encoder) {
+ if (!agent->video_encoder) {
return TRUE;
}

spice_return_val_if_fail(report->unique_id == agent->report_id, TRUE);

- mjpeg_encoder_client_stream_report(agent->mjpeg_encoder,
- report->num_frames,
- report->num_drops,
- report->start_frame_mm_time,
- report->end_frame_mm_time,
- report->last_frame_delay,
- report->audio_delay);
+ agent->video_encoder->client_stream_report(agent->video_encoder,
+ report->num_frames,
+ report->num_drops,
+ report->start_frame_mm_time,
+ report->end_frame_mm_time,
+ report->last_frame_delay,
+ report->audio_delay);
return TRUE;
}

diff --git a/server/dcc.h b/server/dcc.h
index 071a9fc..436d0be 100644
--- a/server/dcc.h
+++ b/server/dcc.h
@@ -109,7 +109,7 @@ struct DisplayChannelClient {
QRegion surface_client_lossy_region[NUM_SURFACES];

StreamAgent stream_agents[NUM_STREAMS];
- int use_mjpeg_encoder_rate_control;
+ int use_video_encoder_rate_control;
uint32_t streams_max_latency;
uint64_t streams_max_bit_rate;
bool gl_draw_ongoing;
diff --git a/server/mjpeg-encoder.c b/server/mjpeg-encoder.c
index c80febd..5ec0753 100644
--- a/server/mjpeg-encoder.c
+++ b/server/mjpeg-encoder.c
@@ -20,7 +20,7 @@
#endif

#include "red-common.h"
-#include "mjpeg-encoder.h"
+#include "video-encoder.h"
#include "utils.h"
#include <jerror.h>
#include <jpeglib.h>
@@ -154,7 +154,8 @@ typedef struct MJpegEncoderRateControl {
uint64_t warmup_start_time;
} MJpegEncoderRateControl;

-struct MJpegEncoder {
+typedef struct MJpegEncoder {
+ VideoEncoder base;
uint8_t *row;
uint32_t row_size;
int first_frame;
@@ -166,13 +167,13 @@ struct MJpegEncoder {
void (*pixel_converter)(void *src, uint8_t *dest);

MJpegEncoderRateControl rate_control;
- MJpegEncoderRateControlCbs cbs;
+ VideoEncoderRateControlCbs cbs;

/* stats */
uint64_t starting_bit_rate;
uint64_t avg_quality;
uint32_t num_frames;
-};
+} MJpegEncoder;

static void mjpeg_encoder_process_server_drops(MJpegEncoder *encoder);
static uint32_t get_min_required_playback_delay(uint64_t frame_enc_size,
@@ -184,8 +185,9 @@ static inline int rate_control_is_active(MJpegEncoder* encoder)
return encoder->cbs.get_roundtrip_ms != NULL;
}

-void mjpeg_encoder_destroy(MJpegEncoder *encoder)
+static void mjpeg_encoder_destroy(VideoEncoder *video_encoder)
{
+ MJpegEncoder *encoder = (MJpegEncoder*)video_encoder;
free(encoder->cinfo.dest);
jpeg_destroy_compress(&encoder->cinfo);
free(encoder->row);
@@ -724,7 +726,7 @@ static int mjpeg_encoder_start_frame(MJpegEncoder *encoder,
interval = (now - rate_control->bit_rate_info.last_frame_time);

if (interval < NSEC_PER_SEC / rate_control->adjusted_fps) {
- return MJPEG_ENCODER_FRAME_DROP;
+ return VIDEO_ENCODER_FRAME_DROP;
}

mjpeg_encoder_adjust_params_to_bit_rate(encoder);
@@ -772,14 +774,14 @@ static int mjpeg_encoder_start_frame(MJpegEncoder *encoder,
break;
default:
spice_debug("unsupported format %d", format);
- return MJPEG_ENCODER_FRAME_UNSUPPORTED;
+ return VIDEO_ENCODER_FRAME_UNSUPPORTED;
}

if (encoder->pixel_converter != NULL) {
unsigned int stride = width * 3;
/* check for integer overflow */
if (stride < width) {
- return MJPEG_ENCODER_FRAME_UNSUPPORTED;
+ return VIDEO_ENCODER_FRAME_UNSUPPORTED;
}
if (encoder->row_size < stride) {
encoder->row = spice_realloc(encoder->row, stride);
@@ -799,7 +801,7 @@ static int mjpeg_encoder_start_frame(MJpegEncoder *encoder,

encoder->num_frames++;
encoder->avg_quality += quality;
- return MJPEG_ENCODER_FRAME_ENCODE_DONE;
+ return VIDEO_ENCODER_FRAME_ENCODE_DONE;
}

static int mjpeg_encoder_encode_scanline(MJpegEncoder *encoder,
@@ -923,27 +925,31 @@ static int encode_frame(MJpegEncoder *encoder, const SpiceRect *src,
return TRUE;
}

-int mjpeg_encoder_encode_frame(MJpegEncoder *encoder,
- const SpiceBitmap *bitmap, int width, int height,
- const SpiceRect *src,
- int top_down, uint32_t frame_mm_time,
- uint8_t **outbuf, size_t *outbuf_size,
- int *data_size)
+static int mjpeg_encoder_encode_frame(VideoEncoder *video_encoder,
+ uint32_t frame_mm_time,
+ const SpiceBitmap *bitmap,
+ int width, int height,
+ const SpiceRect *src, int top_down,
+ uint8_t **outbuf, size_t *outbuf_size,
+ int *data_size)
{
+ MJpegEncoder *encoder = (MJpegEncoder*)video_encoder;
+
int ret = mjpeg_encoder_start_frame(encoder, bitmap->format,
- width, height, outbuf, outbuf_size,
- frame_mm_time);
- if (ret != MJPEG_ENCODER_FRAME_ENCODE_DONE) {
+ width, height,
+ outbuf, outbuf_size,
+ frame_mm_time);
+ if (ret != VIDEO_ENCODER_FRAME_ENCODE_DONE) {
return ret;
}

if (!encode_frame(encoder, src, bitmap, top_down)) {
- return MJPEG_ENCODER_FRAME_UNSUPPORTED;
+ return VIDEO_ENCODER_FRAME_UNSUPPORTED;
}

*data_size = mjpeg_encoder_end_frame(encoder);

- return MJPEG_ENCODER_FRAME_ENCODE_DONE;
+ return VIDEO_ENCODER_FRAME_ENCODE_DONE;
}


@@ -1174,14 +1180,15 @@ static uint32_t get_min_required_playback_delay(uint64_t frame_enc_size,
#define MJPEG_VIDEO_VS_AUDIO_LATENCY_FACTOR 1.25
#define MJPEG_VIDEO_DELAY_TH -15

-void mjpeg_encoder_client_stream_report(MJpegEncoder *encoder,
- uint32_t num_frames,
- uint32_t num_drops,
- uint32_t start_frame_mm_time,
- uint32_t end_frame_mm_time,
- int32_t end_frame_delay,
- uint32_t audio_delay)
+static void mjpeg_encoder_client_stream_report(VideoEncoder *video_encoder,
+ uint32_t num_frames,
+ uint32_t num_drops,
+ uint32_t start_frame_mm_time,
+ uint32_t end_frame_mm_time,
+ int32_t end_frame_delay,
+ uint32_t audio_delay)
{
+ MJpegEncoder *encoder = (MJpegEncoder*)video_encoder;
MJpegEncoderRateControl *rate_control = &encoder->rate_control;
MJpegEncoderClientState *client_state = &rate_control->client_state;
uint64_t avg_enc_size = 0;
@@ -1282,8 +1289,9 @@ void mjpeg_encoder_client_stream_report(MJpegEncoder *encoder,
}
}

-void mjpeg_encoder_notify_server_frame_drop(MJpegEncoder *encoder)
+static void mjpeg_encoder_notify_server_frame_drop(VideoEncoder *video_encoder)
{
+ MJpegEncoder *encoder = (MJpegEncoder*)video_encoder;
encoder->rate_control.server_state.num_frames_dropped++;
mjpeg_encoder_process_server_drops(encoder);
}
@@ -1320,24 +1328,33 @@ static void mjpeg_encoder_process_server_drops(MJpegEncoder *encoder)
server_state->num_frames_dropped = 0;
}

-uint64_t mjpeg_encoder_get_bit_rate(MJpegEncoder *encoder)
+static uint64_t mjpeg_encoder_get_bit_rate(VideoEncoder *video_encoder)
{
+ MJpegEncoder *encoder = (MJpegEncoder*)video_encoder;
return encoder->rate_control.byte_rate * 8;
}

-void mjpeg_encoder_get_stats(MJpegEncoder *encoder, MJpegEncoderStats *stats)
+static void mjpeg_encoder_get_stats(VideoEncoder *video_encoder,
+ VideoEncoderStats *stats)
{
+ MJpegEncoder *encoder = (MJpegEncoder*)video_encoder;
spice_assert(encoder != NULL && stats != NULL);
stats->starting_bit_rate = encoder->starting_bit_rate;
- stats->cur_bit_rate = mjpeg_encoder_get_bit_rate(encoder);
+ stats->cur_bit_rate = mjpeg_encoder_get_bit_rate(video_encoder);
stats->avg_quality = (double)encoder->avg_quality / encoder->num_frames;
}

-MJpegEncoder *mjpeg_encoder_new(uint64_t starting_bit_rate,
- MJpegEncoderRateControlCbs *cbs)
+VideoEncoder *mjpeg_encoder_new(uint64_t starting_bit_rate,
+ VideoEncoderRateControlCbs *cbs)
{
MJpegEncoder *encoder = spice_new0(MJpegEncoder, 1);

+ encoder->base.destroy = mjpeg_encoder_destroy;
+ encoder->base.encode_frame = mjpeg_encoder_encode_frame;
+ encoder->base.client_stream_report = mjpeg_encoder_client_stream_report;
+ encoder->base.notify_server_frame_drop = mjpeg_encoder_notify_server_frame_drop;
+ encoder->base.get_bit_rate = mjpeg_encoder_get_bit_rate;
+ encoder->base.get_stats = mjpeg_encoder_get_stats;
encoder->first_frame = TRUE;
encoder->rate_control.byte_rate = starting_bit_rate / 8;
encoder->starting_bit_rate = starting_bit_rate;
@@ -1357,5 +1374,5 @@ MJpegEncoder *mjpeg_encoder_new(uint64_t starting_bit_rate,
encoder->cinfo.err = jpeg_std_error(&encoder->jerr);
jpeg_create_compress(&encoder->cinfo);

- return encoder;
+ return (VideoEncoder*)encoder;
}
diff --git a/server/mjpeg-encoder.h b/server/mjpeg-encoder.h
deleted file mode 100644
index 4d871ff..0000000
--- a/server/mjpeg-encoder.h
+++ /dev/null
@@ -1,102 +0,0 @@
-/* -*- Mode: C; c-basic-offset: 4; indent-tabs-mode: nil -*- */
-/*
- Copyright (C) 2009 Red Hat, Inc.
-
- This library is free software; you can redistribute it and/or
- modify it under the terms of the GNU Lesser General Public
- License as published by the Free Software Foundation; either
- version 2.1 of the License, or (at your option) any later version.
-
- This library is distributed in the hope that it will be useful,
- but WITHOUT ANY WARRANTY; without even the implied warranty of
- MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
- Lesser General Public License for more details.
-
- You should have received a copy of the GNU Lesser General Public
- License along with this library; if not, see <http://www.gnu.org/licenses/>.
-*/
-
-#ifndef _H_MJPEG_ENCODER
-#define _H_MJPEG_ENCODER
-
-#include "red-common.h"
-
-enum {
- MJPEG_ENCODER_FRAME_UNSUPPORTED = -1,
- MJPEG_ENCODER_FRAME_DROP,
- MJPEG_ENCODER_FRAME_ENCODE_DONE,
-};
-
-typedef struct MJpegEncoder MJpegEncoder;
-
-/*
- * Callbacks required for controling and adjusting
- * the stream bit rate:
- * @opaque: a pointer to be passed to the rate control callbacks.
- * get_roundtrip_ms: roundtrip time in milliseconds
- * get_source_fps: the input frame rate (#frames per second), i.e.,
- * the rate of frames arriving from the guest to spice-server,
- * before any drops.
- */
-typedef struct MJpegEncoderRateControlCbs {
- void *opaque;
- uint32_t (*get_roundtrip_ms)(void *opaque);
- uint32_t (*get_source_fps)(void *opaque);
- void (*update_client_playback_delay)(void *opaque, uint32_t delay_ms);
-} MJpegEncoderRateControlCbs;
-
-typedef struct MJpegEncoderStats {
- uint64_t starting_bit_rate;
- uint64_t cur_bit_rate;
- double avg_quality;
-} MJpegEncoderStats;
-
-MJpegEncoder *mjpeg_encoder_new(uint64_t starting_bit_rate,
- MJpegEncoderRateControlCbs *cbs);
-void mjpeg_encoder_destroy(MJpegEncoder *encoder);
-
-int mjpeg_encoder_encode_frame(MJpegEncoder *encoder,
- const SpiceBitmap *bitmap, int width, int height,
- const SpiceRect *src,
- int top_down, uint32_t frame_mm_time,
- uint8_t **outbuf, size_t *outbuf_size,
- int *data_size);
-
-/*
- * bit rate control
- */
-
-/*
- * Data that should be periodically obtained from the client. The report contains:
- * num_frames : the number of frames that reached the client during the time
- * the report is referring to.
- * num_drops : the part of the above frames that was dropped by the client due to
- * late arrival time.
- * start_frame_mm_time: the mm_time of the first frame included in the report
- * end_frame_mm_time : the mm_time of the last_frame included in the report
- * end_frame_delay : (end_frame_mm_time - client_mm_time)
- * audio delay : the latency of the audio playback.
- * If there is no audio playback, set it to MAX_UINT.
- *
- */
-void mjpeg_encoder_client_stream_report(MJpegEncoder *encoder,
- uint32_t num_frames,
- uint32_t num_drops,
- uint32_t start_frame_mm_time,
- uint32_t end_frame_mm_time,
- int32_t end_frame_delay,
- uint32_t audio_delay);
-
-/*
- * Notify the encoder each time a frame is dropped due to pipe
- * congestion.
- * We can deduce the client state by the frame dropping rate in the server.
- * Monitoring the frame drops can help in fine tuning the playback parameters
- * when the client reports are delayed.
- */
-void mjpeg_encoder_notify_server_frame_drop(MJpegEncoder *encoder);
-
-uint64_t mjpeg_encoder_get_bit_rate(MJpegEncoder *encoder);
-void mjpeg_encoder_get_stats(MJpegEncoder *encoder, MJpegEncoderStats *stats);
-
-#endif
diff --git a/server/stream.c b/server/stream.c
index 548c4c7..89fb13e 100644
--- a/server/stream.c
+++ b/server/stream.c
@@ -32,10 +32,10 @@ void stream_agent_stats_print(StreamAgent *agent)
#ifdef STREAM_STATS
StreamStats *stats = &agent->stats;
double passed_mm_time = (stats->end - stats->start) / 1000.0;
- MJpegEncoderStats encoder_stats = {0};
+ VideoEncoderStats encoder_stats = {0};

- if (agent->mjpeg_encoder) {
- mjpeg_encoder_get_stats(agent->mjpeg_encoder, &encoder_stats);
+ if (agent->video_encoder) {
+ agent->video_encoder->get_stats(agent->video_encoder, &encoder_stats);
}

spice_debug("stream=%p dim=(%dx%d) #in-frames=%"PRIu64" #in-avg-fps=%.2f #out-frames=%"PRIu64" "
@@ -79,8 +79,8 @@ void stream_stop(DisplayChannel *display, Stream *stream)
region_clear(&stream_agent->vis_region);
region_clear(&stream_agent->clip);
spice_assert(!pipe_item_is_linked(&stream_agent->destroy_item));
- if (stream_agent->mjpeg_encoder && dcc->use_mjpeg_encoder_rate_control) {
- uint64_t stream_bit_rate = mjpeg_encoder_get_bit_rate(stream_agent->mjpeg_encoder);
+ if (stream_agent->video_encoder && dcc->use_video_encoder_rate_control) {
+ uint64_t stream_bit_rate = stream_agent->video_encoder->get_bit_rate(stream_agent->video_encoder);

if (stream_bit_rate > dcc->streams_max_bit_rate) {
spice_debug("old max-bit-rate=%.2f new=%.2f",
@@ -338,7 +338,7 @@ static void before_reattach_stream(DisplayChannel *display,
dcc = dpi->dcc;
agent = &dcc->stream_agents[index];

- if (!dcc->use_mjpeg_encoder_rate_control &&
+ if (!dcc->use_video_encoder_rate_control &&
!dcc->common.is_low_bandwidth) {
continue;
}
@@ -347,8 +347,8 @@ static void before_reattach_stream(DisplayChannel *display,
#ifdef STREAM_STATS
agent->stats.num_drops_pipe++;
#endif
- if (dcc->use_mjpeg_encoder_rate_control) {
- mjpeg_encoder_notify_server_frame_drop(agent->mjpeg_encoder);
+ if (dcc->use_video_encoder_rate_control) {
+ agent->video_encoder->notify_server_frame_drop(agent->video_encoder);
} else {
++agent->drops;
}
@@ -361,7 +361,7 @@ static void before_reattach_stream(DisplayChannel *display,

agent = &dcc->stream_agents[index];

- if (dcc->use_mjpeg_encoder_rate_control) {
+ if (dcc->use_video_encoder_rate_control) {
continue;
}
if (agent->frames / agent->fps < FPS_TEST_INTERVAL) {
@@ -594,7 +594,7 @@ static void dcc_update_streams_max_latency(DisplayChannelClient *dcc, StreamAgen
}
for (i = 0; i < NUM_STREAMS; i++) {
StreamAgent *other_agent = &dcc->stream_agents[i];
- if (other_agent == remove_agent || !other_agent->mjpeg_encoder) {
+ if (other_agent == remove_agent || !other_agent->video_encoder) {
continue;
}
if (other_agent->client_required_latency > new_max_latency) {
@@ -714,19 +714,19 @@ void dcc_create_stream(DisplayChannelClient *dcc, Stream *stream)
agent->fps = MAX_FPS;
agent->dcc = dcc;

- if (dcc->use_mjpeg_encoder_rate_control) {
- MJpegEncoderRateControlCbs mjpeg_cbs;
+ if (dcc->use_video_encoder_rate_control) {
+ VideoEncoderRateControlCbs video_cbs;
uint64_t initial_bit_rate;

- mjpeg_cbs.opaque = agent;
- mjpeg_cbs.get_roundtrip_ms = get_roundtrip_ms;
- mjpeg_cbs.get_source_fps = get_source_fps;
- mjpeg_cbs.update_client_playback_delay = update_client_playback_delay;
+ video_cbs.opaque = agent;
+ video_cbs.get_roundtrip_ms = get_roundtrip_ms;
+ video_cbs.get_source_fps = get_source_fps;
+ video_cbs.update_client_playback_delay = update_client_playback_delay;

initial_bit_rate = get_initial_bit_rate(dcc, stream);
- agent->mjpeg_encoder = mjpeg_encoder_new(initial_bit_rate, &mjpeg_cbs);
+ agent->video_encoder = mjpeg_encoder_new(initial_bit_rate, &video_cbs);
} else {
- agent->mjpeg_encoder = mjpeg_encoder_new(0, NULL);
+ agent->video_encoder = mjpeg_encoder_new(0, NULL);
}
red_channel_client_pipe_add(RED_CHANNEL_CLIENT(dcc), &agent->create_item);

@@ -752,9 +752,9 @@ void stream_agent_stop(StreamAgent *agent)
DisplayChannelClient *dcc = agent->dcc;

dcc_update_streams_max_latency(dcc, agent);
- if (agent->mjpeg_encoder) {
- mjpeg_encoder_destroy(agent->mjpeg_encoder);
- agent->mjpeg_encoder = NULL;
+ if (agent->video_encoder) {
+ agent->video_encoder->destroy(agent->video_encoder);
+ agent->video_encoder = NULL;
}
}

diff --git a/server/stream.h b/server/stream.h
index a3e84ed..59df9bd 100644
--- a/server/stream.h
+++ b/server/stream.h
@@ -20,7 +20,7 @@

#include <glib.h>
#include "utils.h"
-#include "mjpeg-encoder.h"
+#include "video-encoder.h"
#include "common/region.h"
#include "red-channel.h"
#include "image-cache.h"
@@ -85,7 +85,7 @@ typedef struct StreamAgent {
PipeItem destroy_item;
Stream *stream;
uint64_t last_send_time;
- MJpegEncoder *mjpeg_encoder;
+ VideoEncoder *video_encoder;
DisplayChannelClient *dcc;

int frames;
diff --git a/server/video-encoder.h b/server/video-encoder.h
new file mode 100644
index 0000000..2a857ba
--- /dev/null
+++ b/server/video-encoder.h
@@ -0,0 +1,160 @@
+/* -*- Mode: C; c-basic-offset: 4; indent-tabs-mode: nil -*- */
+/*
+ Copyright (C) 2009 Red Hat, Inc.
+ Copyright (C) 2015 Jeremy White
+ Copyright (C) 2015 Francois Gouget
+
+ This library is free software; you can redistribute it and/or
+ modify it under the terms of the GNU Lesser General Public
+ License as published by the Free Software Foundation; either
+ version 2.1 of the License, or (at your option) any later version.
+
+ This library is distributed in the hope that it will be useful,
+ but WITHOUT ANY WARRANTY; without even the implied warranty of
+ MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ Lesser General Public License for more details.
+
+ You should have received a copy of the GNU Lesser General Public
+ License along with this library; if not, see <http://www.gnu.org/licenses/>.
+*/
+
+#ifndef _H_VIDEO_ENCODER
+#define _H_VIDEO_ENCODER
+
+enum {
+ VIDEO_ENCODER_FRAME_UNSUPPORTED = -1,
+ VIDEO_ENCODER_FRAME_DROP,
+ VIDEO_ENCODER_FRAME_ENCODE_DONE,
+};
+
+typedef struct VideoEncoderStats {
+ uint64_t starting_bit_rate;
+ uint64_t cur_bit_rate;
+ double avg_quality;
+} VideoEncoderStats;
+
+typedef struct VideoEncoder VideoEncoder;
+struct VideoEncoder {
+ /* Releases the video encoder's resources */
+ void (*destroy)(VideoEncoder *encoder);
+
+ /* Compresses the specified src image area into the outbuf buffer.
+ *
+ * @encoder: The video encoder.
+ * @frame_mm_time: The frame's mm-time timestamp in milliseconds.
+ * @bitmap: The Spice screen.
+ * @src: A rectangle specifying the area occupied by the video.
+ * @top_down: If true the first video line is specified by src.top.
+ * @outbuf: The buffer for the compressed frame. This must either
+ * be NULL or point to a buffer allocated by malloc
+ * since it may be reallocated, if its size is too small.
+ * @outbuf_size: The size of the outbuf buffer.
+ * @data_size: The size of the compressed frame.
+ * @return:
+ * VIDEO_ENCODER_FRAME_ENCODE_DONE if successful.
+ * VIDEO_ENCODER_FRAME_UNSUPPORTED if the frame cannot be encoded.
+ * VIDEO_ENCODER_FRAME_DROP if the frame was dropped. This value can
+ * only happen if rate control is active.
+ */
+ int (*encode_frame)(VideoEncoder *encoder, uint32_t frame_mm_time,
+ const SpiceBitmap *bitmap, int width, int height,
+ const SpiceRect *src, int top_down,
+ uint8_t **outbuf, size_t *outbuf_size, int *data_size);
+
+ /*
+ * Bit rate control methods.
+ */
+
+ /* When rate control is active statistics are periodically obtained from
+ * the client and sent to the video encoder through this method.
+ *
+ * @encoder: The video encoder.
+ * @num_frames: The number of frames that reached the client during
+ * the time period the report is referring to.
+ * @num_drops: The part of the above frames that was dropped by the
+ * client due to late arrival time.
+ * @start_frame_mm_time: The mm_time of the first frame included in the
+ * report.
+ * @end_frame_mm_time: The mm_time of the last frame included in the
+ * report.
+ * @end_frame_delay: This indicates how long in advance the client
+ * received the last frame before having to display it.
+ * @audio delay: The latency of the audio playback or MAX_UINT if it
+ * is not tracked.
+ */
+ void (*client_stream_report)(VideoEncoder *encoder,
+ uint32_t num_frames, uint32_t num_drops,
+ uint32_t start_frame_mm_time,
+ uint32_t end_frame_mm_time,
+ int32_t end_frame_delay, uint32_t audio_delay);
+
+ /* This notifies the video encoder each time a frame is dropped due to
+ * pipe congestion.
+ *
+ * Note that frames are being dropped before they are encoded and that
+ * there may be any number of encoded frames in the network queue.
+ * The client reports provide richer and typically more reactive
+ * information for fine tuning the playback parameters but this function
+ * provides a fallback when client reports are getting delayed or are not
+ * supported by the client.
+ *
+ * @encoder: The video encoder.
+ */
+ void (*notify_server_frame_drop)(VideoEncoder *encoder);
+
+ /* This queries the video encoder's current bit rate.
+ *
+ * @encoder: The video encoder.
+ * @return: The current bit rate in bits per second.
+ */
+ uint64_t (*get_bit_rate)(VideoEncoder *encoder);
+
+ /* Collects video statistics.
+ *
+ * @encoder: The video encoder.
+ * @stats: A VideoEncoderStats structure to fill with the collected
+ * statistics.
+ */
+ void (*get_stats)(VideoEncoder *encoder, VideoEncoderStats *stats);
+};
+
+
+/* When rate control is active the video encoder can use these callbacks to
+ * figure out how to adjust the stream bit rate and adjust some stream
+ * parameters.
+ */
+typedef struct VideoEncoderRateControlCbs {
+ /* The opaque parameter for the callbacks */
+ void *opaque;
+
+ /* Returns the stream's estimated roundtrip time in milliseconds. */
+ uint32_t (*get_roundtrip_ms)(void *opaque);
+
+ /* Returns the estimated input frame rate.
+ *
+ * This is the number of frames per second arriving from the guest to
+ * spice-server, before any drops.
+ */
+ uint32_t (*get_source_fps)(void *opaque);
+
+ /* Informs the client of the minimum playback delay.
+ *
+ * @delay_ms: The minimum number of milliseconds required for the
+ * frames to reach the client.
+ */
+ void (*update_client_playback_delay)(void *opaque, uint32_t delay_ms);
+} VideoEncoderRateControlCbs;
+
+
+/* Instantiates the video encoder.
+ *
+ * @starting_bit_rate: An initial estimate of the available stream bit rate
+ * or zero if the client does not support rate control.
+ * @cbs: A set of callback methods to be used for rate control.
+ * @return: A pointer to a structure implementing the VideoEncoder
+ * methods.
+ */
+VideoEncoder* mjpeg_encoder_new(uint64_t starting_bit_rate,
+ VideoEncoderRateControlCbs *cbs);
+
+#endif
--
2.8.0.rc3
Francois Gouget
2016-04-05 15:15:09 UTC
This introduces a pared-down GStreamer-based video encoder to serve as
the basis for later enhancements.
In this form the new encoder supports both regular and sized streams
but lacks any rate control. It should still work fine when bandwidth is
plentiful, such as on LANs.

Signed-off-by: Francois Gouget <***@codeweavers.com>
---
configure.ac | 23 ++
server/Makefile.am | 8 +
server/gstreamer-encoder.c | 541 +++++++++++++++++++++++++++++++++++++++++++++
server/stream.c | 18 +-
server/video-encoder.h | 4 +
5 files changed, 592 insertions(+), 2 deletions(-)
create mode 100644 server/gstreamer-encoder.c

diff --git a/configure.ac b/configure.ac
index 8419508..9904bc8 100644
--- a/configure.ac
+++ b/configure.ac
@@ -69,6 +69,28 @@ dnl =========================================================================
dnl Check optional features
SPICE_CHECK_SMARTCARD

+AC_ARG_ENABLE(gstreamer,
+ AS_HELP_STRING([--enable-gstreamer=@<:@auto/yes/no@:>@],
+ [Enable GStreamer 1.0 support]),,
+ [enable_gstreamer="auto"])
+
+if test "x$enable_gstreamer" != "xno"; then
+ SPICE_CHECK_GSTREAMER(GSTREAMER_1_0, 1.0, [gstreamer-1.0 gstreamer-base-1.0 gstreamer-app-1.0 gstreamer-video-1.0],
+ [enable_gstreamer="yes"
+ SPICE_CHECK_GSTREAMER_ELEMENTS($GST_INSPECT_1_0, [gst-plugins-base 1.0], [appsrc videoconvert appsink])
+ SPICE_CHECK_GSTREAMER_ELEMENTS($GST_INSPECT_1_0, [gstreamer-libav 1.0], [avenc_mjpeg])
+ ],
+ [if test "x$enable_gstreamer" = "xyes"; then
+ AC_MSG_ERROR([GStreamer 1.0 support requested but not found. You may set GSTREAMER_1_0_CFLAGS and GSTREAMER_1_0_LIBS to avoid the need to call pkg-config.])
+ fi
+ ])
+fi
+AM_CONDITIONAL(HAVE_GSTREAMER_1_0, test "x$have_gstreamer_1_0" = "xyes")
+
+if test x"$gstreamer_missing" != x; then
+ SPICE_WARNING([The following GStreamer $enable_gstreamer tools/elements are missing:$gstreamer_missing. The GStreamer video encoder can be built but may not work.])
+fi
+
AC_ARG_ENABLE([automated_tests],
AS_HELP_STRING([--enable-automated-tests], [Enable automated tests using spicy-screenshot (part of spice--gtk)]),,
[enable_automated_tests="no"])
@@ -243,6 +265,7 @@ AC_MSG_NOTICE([

LZ4 support: ${enable_lz4}
Smartcard: ${have_smartcard}
+ GStreamer 1.0: ${have_gstreamer_1_0}
SASL support: ${have_sasl}
Automated tests: ${enable_automated_tests}
Manual: ${have_asciidoc}
diff --git a/server/Makefile.am b/server/Makefile.am
index 8763f54..6149b7b 100644
--- a/server/Makefile.am
+++ b/server/Makefile.am
@@ -12,6 +12,7 @@ AM_CPPFLAGS = \
$(SASL_CFLAGS) \
$(SLIRP_CFLAGS) \
$(SMARTCARD_CFLAGS) \
+ $(GSTREAMER_1_0_CFLAGS) \
$(SPICE_PROTOCOL_CFLAGS) \
$(SSL_CFLAGS) \
$(VISIBILITY_HIDDEN_CFLAGS) \
@@ -45,6 +46,7 @@ libserver_la_LIBADD = \
$(PIXMAN_LIBS) \
$(SASL_LIBS) \
$(SLIRP_LIBS) \
+ $(GSTREAMER_1_0_LIBS) \
$(SSL_LIBS) \
$(Z_LIBS) \
$(SPICE_NONPKGCONFIG_LIBS) \
@@ -152,6 +154,12 @@ libserver_la_SOURCES += \
$(NULL)
endif

+if HAVE_GSTREAMER_1_0
+libserver_la_SOURCES += \
+ gstreamer-encoder.c \
+ $(NULL)
+endif
+
libspice_server_la_LIBADD = libserver.la
libspice_server_la_SOURCES =

diff --git a/server/gstreamer-encoder.c b/server/gstreamer-encoder.c
new file mode 100644
index 0000000..e5c044e
--- /dev/null
+++ b/server/gstreamer-encoder.c
@@ -0,0 +1,541 @@
+/* -*- Mode: C; c-basic-offset: 4; indent-tabs-mode: nil -*- */
+/*
+ Copyright (C) 2015 Jeremy White
+ Copyright (C) 2015 Francois Gouget
+
+ This library is free software; you can redistribute it and/or
+ modify it under the terms of the GNU Lesser General Public
+ License as published by the Free Software Foundation; either
+ version 2.1 of the License, or (at your option) any later version.
+
+ This library is distributed in the hope that it will be useful,
+ but WITHOUT ANY WARRANTY; without even the implied warranty of
+ MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ Lesser General Public License for more details.
+
+ You should have received a copy of the GNU Lesser General Public
+ License along with this library; if not, see <http://www.gnu.org/licenses/>.
+*/
+#ifdef HAVE_CONFIG_H
+#include <config.h>
+#endif
+
+#include <gst/gst.h>
+#include <gst/app/gstappsrc.h>
+#include <gst/app/gstappsink.h>
+
+#include "red-common.h"
+#include "video-encoder.h"
+
+
+#define SPICE_GST_DEFAULT_FPS 30
+
+
+typedef struct {
+ SpiceBitmapFmt spice_format;
+ const char *format;
+ uint32_t bpp;
+} SpiceFormatForGStreamer;
+
+typedef struct SpiceGstEncoder {
+ VideoEncoder base;
+
+ /* Rate control callbacks */
+ VideoEncoderRateControlCbs cbs;
+
+ /* Spice's initial bit rate estimation in bits per second. */
+ uint64_t starting_bit_rate;
+
+ /* ---------- Video characteristics ---------- */
+
+ uint32_t width;
+ uint32_t height;
+ const SpiceFormatForGStreamer *format;
+ SpiceBitmapFmt spice_format;
+
+ /* ---------- GStreamer pipeline ---------- */
+
+ /* Pointers to the GStreamer pipeline elements. If pipeline is NULL the
+ * other pointers are invalid.
+ */
+ GstElement *pipeline;
+ GstAppSrc *appsrc;
+ GstElement *gstenc;
+ GstAppSink *appsink;
+
+ /* If src_caps is NULL the pipeline has not been configured yet. */
+ GstCaps *src_caps;
+
+ /* The frame counter for GStreamer buffers */
+ uint32_t frame;
+
+ /* The bit rate target for the outgoing network stream. (bits per second) */
+ uint64_t bit_rate;
+
+ /* The minimum bit rate */
+# define SPICE_GST_MIN_BITRATE (128 * 1024)
+
+ /* The default bit rate */
+# define SPICE_GST_DEFAULT_BITRATE (8 * 1024 * 1024)
+} SpiceGstEncoder;
+
+
+/* ---------- Miscellaneous SpiceGstEncoder helpers ---------- */
+
+static inline double get_mbps(uint64_t bit_rate)
+{
+ return (double)bit_rate / 1024 / 1024;
+}
+
+/* Returns the source frame rate which may change at any time so don't store
+ * the result.
+ */
+static uint32_t get_source_fps(SpiceGstEncoder *encoder)
+{
+ return encoder->cbs.get_source_fps ?
+ encoder->cbs.get_source_fps(encoder->cbs.opaque) : SPICE_GST_DEFAULT_FPS;
+}
+
+static inline int is_pipeline_configured(SpiceGstEncoder *encoder)
+{
+ return encoder->src_caps != NULL;
+}
+
+static void free_pipeline(SpiceGstEncoder *encoder)
+{
+ if (encoder->src_caps) {
+ gst_caps_unref(encoder->src_caps);
+ encoder->src_caps = NULL;
+ }
+ if (encoder->pipeline) {
+ gst_element_set_state(encoder->pipeline, GST_STATE_NULL);
+ gst_object_unref(encoder->appsrc);
+ gst_object_unref(encoder->gstenc);
+ gst_object_unref(encoder->appsink);
+ gst_object_unref(encoder->pipeline);
+ encoder->pipeline = NULL;
+ }
+}
+
+/* The maximum bit rate we will use for the current video.
+ *
+ * This is based on a 10x compression ratio which should be more than enough
+ * for even MJPEG to provide good quality.
+ */
+static uint64_t get_bit_rate_cap(SpiceGstEncoder *encoder)
+{
+ uint32_t raw_frame_bits = encoder->width * encoder->height * encoder->format->bpp;
+ return raw_frame_bits * get_source_fps(encoder) / 10;
+}
+
+static void adjust_bit_rate(SpiceGstEncoder *encoder)
+{
+ if (encoder->bit_rate == 0) {
+ /* Use the default value, */
+ encoder->bit_rate = SPICE_GST_DEFAULT_BITRATE;
+ } else if (encoder->bit_rate < SPICE_GST_MIN_BITRATE) {
+ /* don't let the bit rate go too low */
+ encoder->bit_rate = SPICE_GST_MIN_BITRATE;
+ } else {
+ /* or too high */
+ encoder->bit_rate = MIN(encoder->bit_rate, get_bit_rate_cap(encoder));
+ }
+ spice_debug("adjust_bit_rate(%.3fMbps)", get_mbps(encoder->bit_rate));
+}
+
+
+/* ---------- GStreamer pipeline ---------- */
+
+/* A helper for spice_gst_encoder_encode_frame() */
+static const SpiceFormatForGStreamer *map_format(SpiceBitmapFmt format)
+{
+ /* See GStreamer's part-mediatype-video-raw.txt and
+ * section-types-definitions.html documents.
+ */
+ static const SpiceFormatForGStreamer format_map[] = {
+ {SPICE_BITMAP_FMT_RGBA, "BGRA", 32},
+ {SPICE_BITMAP_FMT_16BIT, "RGB15", 16},
+ /* TODO: Test the other formats */
+ {SPICE_BITMAP_FMT_32BIT, "BGRx", 32},
+ {SPICE_BITMAP_FMT_24BIT, "BGR", 24},
+ };
+
+ int i;
+ for (i = 0; i < G_N_ELEMENTS(format_map); i++) {
+ if (format_map[i].spice_format == format) {
+ if (i > 1) {
+ spice_warning("The %d format has not been tested yet", format);
+ }
+ return &format_map[i];
+ }
+ }
+
+ return NULL;
+}
+
+static void set_appsrc_caps(SpiceGstEncoder *encoder)
+{
+ if (encoder->src_caps) {
+ gst_caps_unref(encoder->src_caps);
+ }
+ encoder->src_caps = gst_caps_new_simple(
+ "video/x-raw",
+ "format", G_TYPE_STRING, encoder->format->format,
+ "width", G_TYPE_INT, encoder->width,
+ "height", G_TYPE_INT, encoder->height,
+ "framerate", GST_TYPE_FRACTION, get_source_fps(encoder), 1,
+ NULL);
+ g_object_set(G_OBJECT(encoder->appsrc), "caps", encoder->src_caps, NULL);
+}
+
+/* A helper for spice_gst_encoder_encode_frame() */
+static gboolean create_pipeline(SpiceGstEncoder *encoder)
+{
+ GError *err = NULL;
+ /* Set max-threads to ensure zero-frame latency */
+ const gchar *desc = "appsrc is-live=true format=time do-timestamp=true name=src ! videoconvert ! avenc_mjpeg max-threads=1 name=encoder ! appsink name=sink";
+ spice_debug("GStreamer pipeline: %s", desc);
+ encoder->pipeline = gst_parse_launch_full(desc, NULL, GST_PARSE_FLAG_FATAL_ERRORS, &err);
+ if (!encoder->pipeline || err) {
+ spice_warning("GStreamer error: %s", err->message);
+ g_clear_error(&err);
+ if (encoder->pipeline) {
+ gst_object_unref(encoder->pipeline);
+ encoder->pipeline = NULL;
+ }
+ return FALSE;
+ }
+ encoder->appsrc = GST_APP_SRC(gst_bin_get_by_name(GST_BIN(encoder->pipeline), "src"));
+ encoder->gstenc = gst_bin_get_by_name(GST_BIN(encoder->pipeline), "encoder");
+ encoder->appsink = GST_APP_SINK(gst_bin_get_by_name(GST_BIN(encoder->pipeline), "sink"));
+ return TRUE;
+}
+
+/* A helper for spice_gst_encoder_encode_frame() */
+static gboolean configure_pipeline(SpiceGstEncoder *encoder,
+ const SpiceBitmap *bitmap)
+{
+ if (!encoder->pipeline && !create_pipeline(encoder)) {
+ return FALSE;
+ }
+
+ /* Configure the encoder bitrate */
+ adjust_bit_rate(encoder);
+ g_object_set(G_OBJECT(encoder->gstenc),
+ "bitrate", (gint)encoder->bit_rate, NULL);
+
+ /* See https://bugzilla.gnome.org/show_bug.cgi?id=753257 */
+ spice_debug("removing the pipeline clock");
+ gst_pipeline_use_clock(GST_PIPELINE(encoder->pipeline), NULL);
+
+ /* Set the source caps */
+ set_appsrc_caps(encoder);
+
+ /* Start playing */
+ if (gst_element_set_state(encoder->pipeline, GST_STATE_PLAYING) == GST_STATE_CHANGE_FAILURE) {
+ spice_warning("GStreamer error: unable to set the pipeline to the playing state");
+ free_pipeline(encoder);
+ return FALSE;
+ }
+
+ return TRUE;
+}
+
+/* A helper for spice_gst_encoder_encode_frame() */
+static void reconfigure_pipeline(SpiceGstEncoder *encoder)
+{
+ if (!is_pipeline_configured(encoder)) {
+ return;
+ }
+ if (gst_element_set_state(encoder->pipeline, GST_STATE_PAUSED) == GST_STATE_CHANGE_FAILURE) {
+ spice_debug("GStreamer error: could not pause the pipeline, rebuilding it instead");
+ free_pipeline(encoder);
+ }
+ set_appsrc_caps(encoder);
+ if (gst_element_set_state(encoder->pipeline, GST_STATE_PLAYING) == GST_STATE_CHANGE_FAILURE) {
+ spice_debug("GStreamer error: could not restart the pipeline, rebuilding it instead");
+ free_pipeline(encoder);
+ }
+}
+
+/* A helper for the *_copy() functions */
+static int is_chunk_padded(const SpiceBitmap *bitmap, uint32_t index)
+{
+ SpiceChunks *chunks = bitmap->data;
+ if (chunks->chunk[index].len % bitmap->stride != 0) {
+ spice_warning("chunk %d/%d is padded, cannot copy", index, chunks->num_chunks);
+ return TRUE;
+ }
+ return FALSE;
+}
+
+/* A helper for push_raw_frame() */
+static inline int line_copy(SpiceGstEncoder *encoder, const SpiceBitmap *bitmap,
+ uint32_t chunk_offset, uint32_t stream_stride,
+ uint32_t height, uint8_t *buffer)
+{
+ uint8_t *dst = buffer;
+ SpiceChunks *chunks = bitmap->data;
+ uint32_t chunk_index = 0;
+ for (int l = 0; l < height; l++) {
+ /* We may have to move forward by more than one chunk the first
+ * time around.
+ */
+ while (chunk_offset >= chunks->chunk[chunk_index].len) {
+ if (is_chunk_padded(bitmap, chunk_index)) {
+ return FALSE;
+ }
+ chunk_offset -= chunks->chunk[chunk_index].len;
+ chunk_index++;
+ }
+
+ /* Copy the line */
+ uint8_t *src = chunks->chunk[chunk_index].data + chunk_offset;
+ memcpy(dst, src, stream_stride);
+ dst += stream_stride;
+ chunk_offset += bitmap->stride;
+ }
+ spice_return_val_if_fail(dst - buffer == stream_stride * height, FALSE);
+ return TRUE;
+}
+
+/* A helper for push_raw_frame() */
+static inline int chunk_copy(SpiceGstEncoder *encoder, const SpiceBitmap *bitmap,
+ uint32_t chunk_offset, uint32_t len, uint8_t *dst)
+{
+ SpiceChunks *chunks = bitmap->data;
+ uint32_t chunk_index = 0;
+ /* Skip chunks until we find the start of the frame */
+ while (chunk_index < chunks->num_chunks &&
+ chunk_offset >= chunks->chunk[chunk_index].len) {
+ if (is_chunk_padded(bitmap, chunk_index)) {
+ return FALSE;
+ }
+ chunk_offset -= chunks->chunk[chunk_index].len;
+ chunk_index++;
+ }
+
+ /* We can copy the frame chunk by chunk */
+ while (len && chunk_index < chunks->num_chunks) {
+ if (is_chunk_padded(bitmap, chunk_index)) {
+ return FALSE;
+ }
+ uint8_t *src = chunks->chunk[chunk_index].data + chunk_offset;
+ uint32_t thislen = MIN(chunks->chunk[chunk_index].len - chunk_offset, len);
+ memcpy(dst, src, thislen);
+ dst += thislen;
+ len -= thislen;
+ chunk_offset = 0;
+ chunk_index++;
+ }
+ spice_return_val_if_fail(len == 0, FALSE);
+ return TRUE;
+}
+
+/* A helper for spice_gst_encoder_encode_frame() */
+static int push_raw_frame(SpiceGstEncoder *encoder, const SpiceBitmap *bitmap,
+ const SpiceRect *src, int top_down)
+{
+ uint32_t height = src->bottom - src->top;
+ uint32_t stream_stride = (src->right - src->left) * encoder->format->bpp / 8;
+ uint32_t len = stream_stride * height;
+ GstBuffer *buffer = gst_buffer_new_and_alloc(len);
+ GstMapInfo map;
+ gst_buffer_map(buffer, &map, GST_MAP_WRITE);
+ uint8_t *dst = map.data;
+
+ /* Note that we should not reorder the lines, even if top_down is false.
+ * It just changes the number of lines to skip at the start of the bitmap.
+ */
+ uint32_t skip_lines = top_down ? src->top : bitmap->y - (src->bottom - 0);
+ uint32_t chunk_offset = bitmap->stride * skip_lines;
+
+ if (stream_stride != bitmap->stride) {
+ /* We have to do a line-by-line copy because for each we have to
+ * leave out pixels on the left or right.
+ */
+ chunk_offset += src->left * encoder->format->bpp / 8;
+ if (!line_copy(encoder, bitmap, chunk_offset, stream_stride, height, dst)) {
+ gst_buffer_unmap(buffer, &map);
+ gst_buffer_unref(buffer);
+ return VIDEO_ENCODER_FRAME_UNSUPPORTED;
+ }
+ } else {
+ if (!chunk_copy(encoder, bitmap, chunk_offset, len, dst)) {
+ gst_buffer_unmap(buffer, &map);
+ gst_buffer_unref(buffer);
+ return VIDEO_ENCODER_FRAME_UNSUPPORTED;
+ }
+ }
+ gst_buffer_unmap(buffer, &map);
+ GST_BUFFER_OFFSET(buffer) = encoder->frame++;
+
+ GstFlowReturn ret = gst_app_src_push_buffer(encoder->appsrc, buffer);
+ if (ret != GST_FLOW_OK) {
+ spice_warning("GStreamer error: unable to push source buffer (%d)", ret);
+ return VIDEO_ENCODER_FRAME_UNSUPPORTED;
+ }
+
+ return VIDEO_ENCODER_FRAME_ENCODE_DONE;
+}
+
+/* A helper for spice_gst_encoder_encode_frame() */
+static int pull_compressed_buffer(SpiceGstEncoder *encoder,
+ uint8_t **outbuf, size_t *outbuf_size,
+ int *data_size)
+{
+ spice_return_val_if_fail(outbuf && outbuf_size, VIDEO_ENCODER_FRAME_UNSUPPORTED);
+
+ GstSample *sample = gst_app_sink_pull_sample(encoder->appsink);
+ if (sample) {
+ GstMapInfo map;
+ GstBuffer *buffer = gst_sample_get_buffer(sample);
+ if (buffer && gst_buffer_map(buffer, &map, GST_MAP_READ)) {
+ int size = gst_buffer_get_size(buffer);
+ if (!*outbuf || *outbuf_size < size) {
+ free(*outbuf);
+ *outbuf = spice_malloc(size);
+ *outbuf_size = size;
+ }
+ /* TODO Try to avoid this copy by changing the GstBuffer handling */
+ memcpy(*outbuf, map.data, size);
+ *data_size = size;
+ gst_buffer_unmap(buffer, &map);
+ gst_sample_unref(sample);
+ return VIDEO_ENCODER_FRAME_ENCODE_DONE;
+ }
+ gst_sample_unref(sample);
+ }
+ spice_debug("failed to pull the compressed buffer");
+ return VIDEO_ENCODER_FRAME_UNSUPPORTED;
+}
+
+
+/* ---------- VideoEncoder's public API ---------- */
+
+static void spice_gst_encoder_destroy(VideoEncoder *video_encoder)
+{
+ SpiceGstEncoder *encoder = (SpiceGstEncoder*)video_encoder;
+ free_pipeline(encoder);
+ free(encoder);
+}
+
+static int spice_gst_encoder_encode_frame(VideoEncoder *video_encoder,
+ uint32_t frame_mm_time,
+ const SpiceBitmap *bitmap,
+ int width, int height,
+ const SpiceRect *src, int top_down,
+ uint8_t **outbuf, size_t *outbuf_size,
+ int *data_size)
+{
+ SpiceGstEncoder *encoder = (SpiceGstEncoder*)video_encoder;
+
+ if (width != encoder->width || height != encoder->height ||
+ encoder->spice_format != bitmap->format) {
+ spice_debug("video format change: width %d -> %d, height %d -> %d, format %d -> %d",
+ encoder->width, width, encoder->height, height,
+ encoder->spice_format, bitmap->format);
+ encoder->format = map_format(bitmap->format);
+ if (!encoder->format) {
+ spice_warning("unable to map format type %d", bitmap->format);
+ return VIDEO_ENCODER_FRAME_UNSUPPORTED;
+ }
+ encoder->spice_format = bitmap->format;
+ encoder->width = width;
+ encoder->height = height;
+ if (encoder->pipeline) {
+ reconfigure_pipeline(encoder);
+ }
+ }
+ if (!is_pipeline_configured(encoder) &&
+ !configure_pipeline(encoder, bitmap)) {
+ return VIDEO_ENCODER_FRAME_UNSUPPORTED;
+ }
+
+ int rc = push_raw_frame(encoder, bitmap, src, top_down);
+ if (rc == VIDEO_ENCODER_FRAME_ENCODE_DONE) {
+ rc = pull_compressed_buffer(encoder, outbuf, outbuf_size, data_size);
+ if (rc != VIDEO_ENCODER_FRAME_ENCODE_DONE) {
+ /* The input buffer will be stuck in the pipeline, preventing
+ * later ones from being processed. So reset the pipeline.
+ */
+ free_pipeline(encoder);
+ }
+ }
+ return rc;
+}
+
+static void spice_gst_encoder_client_stream_report(VideoEncoder *video_encoder,
+ uint32_t num_frames,
+ uint32_t num_drops,
+ uint32_t start_frame_mm_time,
+ uint32_t end_frame_mm_time,
+ int32_t end_frame_delay,
+ uint32_t audio_delay)
+{
+ spice_debug("client report: #frames %u, #drops %d, duration %u video-delay %d audio-delay %u",
+ num_frames, num_drops,
+ end_frame_mm_time - start_frame_mm_time,
+ end_frame_delay, audio_delay);
+}
+
+static void spice_gst_encoder_notify_server_frame_drop(VideoEncoder *video_encoder)
+{
+ spice_debug("server report: getting frame drops...");
+}
+
+static uint64_t spice_gst_encoder_get_bit_rate(VideoEncoder *video_encoder)
+{
+ SpiceGstEncoder *encoder = (SpiceGstEncoder*)video_encoder;
+ return encoder->bit_rate;
+}
+
+static void spice_gst_encoder_get_stats(VideoEncoder *video_encoder,
+ VideoEncoderStats *stats)
+{
+ SpiceGstEncoder *encoder = (SpiceGstEncoder*)video_encoder;
+ uint64_t raw_bit_rate = encoder->width * encoder->height * (encoder->format ? encoder->format->bpp : 0) * get_source_fps(encoder);
+
+ spice_return_if_fail(stats != NULL);
+ stats->starting_bit_rate = encoder->starting_bit_rate;
+ stats->cur_bit_rate = encoder->bit_rate;
+
+ /* Use the compression level as a proxy for the quality */
+ stats->avg_quality = stats->cur_bit_rate ? 100.0 - raw_bit_rate / stats->cur_bit_rate : 0;
+ if (stats->avg_quality < 0) {
+ stats->avg_quality = 0;
+ }
+}
+
+VideoEncoder *gstreamer_encoder_new(uint64_t starting_bit_rate,
+ VideoEncoderRateControlCbs *cbs)
+{
+ GError *err = NULL;
+ if (!gst_init_check(NULL, NULL, &err)) {
+ spice_warning("GStreamer error: %s", err->message);
+ g_clear_error(&err);
+ return NULL;
+ }
+
+ SpiceGstEncoder *encoder = spice_new0(SpiceGstEncoder, 1);
+ encoder->base.destroy = spice_gst_encoder_destroy;
+ encoder->base.encode_frame = spice_gst_encoder_encode_frame;
+ encoder->base.client_stream_report = spice_gst_encoder_client_stream_report;
+ encoder->base.notify_server_frame_drop = spice_gst_encoder_notify_server_frame_drop;
+ encoder->base.get_bit_rate = spice_gst_encoder_get_bit_rate;
+ encoder->base.get_stats = spice_gst_encoder_get_stats;
+
+ if (cbs) {
+ encoder->cbs = *cbs;
+ }
+ encoder->starting_bit_rate = starting_bit_rate;
+
+ /* All the other fields are initialized to zero by spice_new0(). */
+
+ if (!create_pipeline(encoder)) {
+ /* Some GStreamer dependency is probably missing */
+ free(encoder);
+ encoder = NULL;
+ }
+ return (VideoEncoder*)encoder;
+}
diff --git a/server/stream.c b/server/stream.c
index 89fb13e..05bfb84 100644
--- a/server/stream.c
+++ b/server/stream.c
@@ -696,6 +696,20 @@ static void update_client_playback_delay(void *opaque, uint32_t delay_ms)
agent->dcc->streams_max_latency);
}

+/* A helper for dcc_create_stream(). */
+static VideoEncoder* dcc_create_video_encoder(uint64_t starting_bit_rate,
+ VideoEncoderRateControlCbs *cbs)
+{
+#ifdef HAVE_GSTREAMER_1_0
+ VideoEncoder* video_encoder = gstreamer_encoder_new(starting_bit_rate, cbs);
+ if (video_encoder) {
+ return video_encoder;
+ }
+#endif
+ /* Use the builtin MJPEG video encoder as a fallback */
+ return mjpeg_encoder_new(starting_bit_rate, cbs);
+}
+
void dcc_create_stream(DisplayChannelClient *dcc, Stream *stream)
{
StreamAgent *agent = &dcc->stream_agents[get_stream_id(DCC_TO_DC(dcc), stream)];
@@ -724,9 +738,9 @@ void dcc_create_stream(DisplayChannelClient *dcc, Stream *stream)
video_cbs.update_client_playback_delay = update_client_playback_delay;

initial_bit_rate = get_initial_bit_rate(dcc, stream);
- agent->video_encoder = mjpeg_encoder_new(initial_bit_rate, &video_cbs);
+ agent->video_encoder = dcc_create_video_encoder(initial_bit_rate, &video_cbs);
} else {
- agent->video_encoder = mjpeg_encoder_new(0, NULL);
+ agent->video_encoder = dcc_create_video_encoder(0, NULL);
}
red_channel_client_pipe_add(RED_CHANNEL_CLIENT(dcc), &agent->create_item);

diff --git a/server/video-encoder.h b/server/video-encoder.h
index 2a857ba..104f3b5 100644
--- a/server/video-encoder.h
+++ b/server/video-encoder.h
@@ -156,5 +156,9 @@ typedef struct VideoEncoderRateControlCbs {
*/
VideoEncoder* mjpeg_encoder_new(uint64_t starting_bit_rate,
VideoEncoderRateControlCbs *cbs);
+#ifdef HAVE_GSTREAMER_1_0
+VideoEncoder* gstreamer_encoder_new(uint64_t starting_bit_rate,
+ VideoEncoderRateControlCbs *cbs);
+#endif

#endif
--
2.8.0.rc3
Francois Gouget
2016-04-05 15:15:16 UTC
Signed-off-by: Francois Gouget <***@codeweavers.com>
---
server/dcc-send.c | 3 ++-
server/stream.c | 25 ++++++++++++++++---------
2 files changed, 18 insertions(+), 10 deletions(-)

diff --git a/server/dcc-send.c b/server/dcc-send.c
index 2fd4129..ebbc3e5 100644
--- a/server/dcc-send.c
+++ b/server/dcc-send.c
@@ -1706,7 +1706,8 @@ static int red_marshall_stream_data(RedChannelClient *rcc,
drawable->red_drawable->mm_time :
reds_get_mm_time();
outbuf_size = dcc->send_data.stream_outbuf_size;
- ret = agent->video_encoder->encode_frame(agent->video_encoder,
+ ret = !agent->video_encoder ? VIDEO_ENCODER_FRAME_UNSUPPORTED :
+ agent->video_encoder->encode_frame(agent->video_encoder,
frame_mm_time,
&image->u.bitmap, width, height,
&drawable->red_drawable->u.copy.src_area,
diff --git a/server/stream.c b/server/stream.c
index 05bfb84..64d4a90 100644
--- a/server/stream.c
+++ b/server/stream.c
@@ -697,17 +697,24 @@ static void update_client_playback_delay(void *opaque, uint32_t delay_ms)
}

/* A helper for dcc_create_stream(). */
-static VideoEncoder* dcc_create_video_encoder(uint64_t starting_bit_rate,
+static VideoEncoder* dcc_create_video_encoder(DisplayChannelClient *dcc,
+ uint64_t starting_bit_rate,
VideoEncoderRateControlCbs *cbs)
{
+ RedChannelClient *rcc = RED_CHANNEL_CLIENT(dcc);
+ int client_has_multi_codec = red_channel_client_test_remote_cap(rcc, SPICE_DISPLAY_CAP_MULTI_CODEC);
+ if (!client_has_multi_codec || red_channel_client_test_remote_cap(rcc, SPICE_DISPLAY_CAP_CODEC_MJPEG)) {
#ifdef HAVE_GSTREAMER_1_0
- VideoEncoder* video_encoder = gstreamer_encoder_new(starting_bit_rate, cbs);
- if (video_encoder) {
- return video_encoder;
- }
+ VideoEncoder* video_encoder = gstreamer_encoder_new(starting_bit_rate, cbs);
+ if (video_encoder) {
+ return video_encoder;
+ }
#endif
- /* Use the builtin MJPEG video encoder as a fallback */
- return mjpeg_encoder_new(starting_bit_rate, cbs);
+ /* Use the builtin MJPEG video encoder as a fallback */
+ return mjpeg_encoder_new(starting_bit_rate, cbs);
+ }
+
+ return NULL;
}

void dcc_create_stream(DisplayChannelClient *dcc, Stream *stream)
@@ -738,9 +745,9 @@ void dcc_create_stream(DisplayChannelClient *dcc, Stream *stream)
video_cbs.update_client_playback_delay = update_client_playback_delay;

initial_bit_rate = get_initial_bit_rate(dcc, stream);
- agent->video_encoder = dcc_create_video_encoder(initial_bit_rate, &video_cbs);
+ agent->video_encoder = dcc_create_video_encoder(dcc, initial_bit_rate, &video_cbs);
} else {
- agent->video_encoder = dcc_create_video_encoder(0, NULL);
+ agent->video_encoder = dcc_create_video_encoder(dcc, 0, NULL);
}
red_channel_client_pipe_add(RED_CHANNEL_CLIENT(dcc), &agent->create_item);
--
2.8.0.rc3
Francois Gouget
2016-04-05 15:15:23 UTC
The Spice server administrator can specify the encoder and codec
preferences to optimize for CPU or bandwidth usage. The preferences are
given as a semicolon-separated list of encoder:codec pairs.
The server has a default preference list which can be explicitly
selected by specifying 'auto'.
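
For illustration, parsing such a preference string could look roughly
like this (a minimal sketch with an illustrative helper name; the real
patch additionally validates the encoder and codec names and stores the
result in the GArray handed to the display channel):

/* Parse a preference string such as "gstreamer:mjpeg;spice:mjpeg".
 * "auto" keeps the server's built-in default preference list.
 */
static void parse_video_codecs(const char *preferences)
{
    if (g_strcmp0(preferences, "auto") == 0) {
        return;
    }
    gchar **pairs = g_strsplit(preferences, ";", 0);
    for (guint i = 0; pairs[i] != NULL; i++) {
        gchar **fields = g_strsplit(pairs[i], ":", 2);
        if (fields[0] != NULL && fields[1] != NULL) {
            spice_debug("encoder '%s', codec '%s'", fields[0], fields[1]);
        }
        g_strfreev(fields);
    }
    g_strfreev(pairs);
}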

The server then picks a codec supported by the client based on the
following new client capabilities:
* SPICE_DISPLAY_CAP_MULTI_CODEC, which denotes a recent client that
supports multiple codecs. This capability is needed so the server does
not have to hardcode the assumption that MJPEG is supported, which makes
it possible to write clients that don't support MJPEG.
* SPICE_DISPLAY_CAP_CODEC_XXX, where XXX is a supported codec. Note
that for now the server only supports the MJPEG codec.

Signed-off-by: Francois Gouget <***@codeweavers.com>
---
server/dcc-send.c | 2 +-
server/display-channel.c | 10 +++
server/display-channel.h | 4 ++
server/gstreamer-encoder.c | 6 +-
server/mjpeg-encoder.c | 6 +-
server/red-qxl.c | 9 +++
server/red-qxl.h | 6 ++
server/red-worker.c | 18 +++++-
server/reds-private.h | 1 +
server/reds.c | 155 ++++++++++++++++++++++++++++++++++++++++-----
server/reds.h | 1 +
server/spice-server.h | 8 +++
server/spice-server.syms | 5 ++
server/stream.c | 29 +++++++--
server/video-encoder.h | 20 +++++-
15 files changed, 251 insertions(+), 29 deletions(-)

diff --git a/server/dcc-send.c b/server/dcc-send.c
index ebbc3e5..91d6352 100644
--- a/server/dcc-send.c
+++ b/server/dcc-send.c
@@ -2151,7 +2151,7 @@ static void marshall_stream_start(RedChannelClient *rcc,
stream_create.surface_id = 0;
stream_create.id = get_stream_id(DCC_TO_DC(dcc), stream);
stream_create.flags = stream->top_down ? SPICE_STREAM_FLAGS_TOP_DOWN : 0;
- stream_create.codec_type = SPICE_VIDEO_CODEC_TYPE_MJPEG;
+ stream_create.codec_type = agent->video_encoder->codec_type;

stream_create.src_width = stream->width;
stream_create.src_height = stream->height;
diff --git a/server/display-channel.c b/server/display-channel.c
index a6d90cf..5a0c6fb 100644
--- a/server/display-channel.c
+++ b/server/display-channel.c
@@ -230,6 +230,14 @@ void display_channel_set_stream_video(DisplayChannel *display, int stream_video)
display->stream_video = stream_video;
}

+void display_channel_set_video_codecs(DisplayChannel *display, GArray *video_codecs)
+{
+ spice_return_if_fail(display);
+
+ g_array_unref(display->video_codecs);
+ display->video_codecs = g_array_ref(video_codecs);
+}
+
static void stop_streams(DisplayChannel *display)
{
Ring *ring = &display->streams;
@@ -2029,6 +2037,7 @@ static SpiceCanvas *image_surfaces_get(SpiceImageSurfaces *surfaces, uint32_t su

DisplayChannel* display_channel_new(SpiceServer *reds, RedWorker *worker,
int migrate, int stream_video,
+ GArray *video_codecs,
uint32_t n_surfaces)
{
DisplayChannel *display;
@@ -2082,6 +2091,7 @@ DisplayChannel* display_channel_new(SpiceServer *reds, RedWorker *worker,
drawables_init(display);
image_cache_init(&display->image_cache);
display->stream_video = stream_video;
+ display->video_codecs = g_array_ref(video_codecs);
display_channel_init_streams(display);

return display;
diff --git a/server/display-channel.h b/server/display-channel.h
index 6b053de..cf3709a 100644
--- a/server/display-channel.h
+++ b/server/display-channel.h
@@ -183,6 +183,7 @@ struct DisplayChannel {
uint32_t glz_drawable_count;

int stream_video;
+ GArray *video_codecs;
uint32_t stream_count;
Stream streams_buf[NUM_STREAMS];
Stream *free_streams;
@@ -254,6 +255,7 @@ DisplayChannel* display_channel_new (SpiceServe
RedWorker *worker,
int migrate,
int stream_video,
+ GArray *video_codecs,
uint32_t n_surfaces);
void display_channel_create_surface (DisplayChannel *display, uint32_t surface_id,
uint32_t width, uint32_t height,
@@ -275,6 +277,8 @@ void display_channel_update (DisplayCha
void display_channel_free_some (DisplayChannel *display);
void display_channel_set_stream_video (DisplayChannel *display,
int stream_video);
+void display_channel_set_video_codecs (DisplayChannel *display,
+ GArray *video_codecs);
int display_channel_get_streams_timeout (DisplayChannel *display);
void display_channel_compress_stats_print (const DisplayChannel *display);
void display_channel_compress_stats_reset (DisplayChannel *display);
diff --git a/server/gstreamer-encoder.c b/server/gstreamer-encoder.c
index e5c044e..d6eb1eb 100644
--- a/server/gstreamer-encoder.c
+++ b/server/gstreamer-encoder.c
@@ -507,9 +507,12 @@ static void spice_gst_encoder_get_stats(VideoEncoder *video_encoder,
}
}

-VideoEncoder *gstreamer_encoder_new(uint64_t starting_bit_rate,
+VideoEncoder *gstreamer_encoder_new(SpiceVideoCodecType codec_type,
+ uint64_t starting_bit_rate,
VideoEncoderRateControlCbs *cbs)
{
+ spice_return_val_if_fail(codec_type == SPICE_VIDEO_CODEC_TYPE_MJPEG, NULL);
+
GError *err = NULL;
if (!gst_init_check(NULL, NULL, &err)) {
spice_warning("GStreamer error: %s", err->message);
@@ -524,6 +527,7 @@ VideoEncoder *gstreamer_encoder_new(uint64_t starting_bit_rate,
encoder->base.notify_server_frame_drop = spice_gst_encoder_notify_server_frame_drop;
encoder->base.get_bit_rate = spice_gst_encoder_get_bit_rate;
encoder->base.get_stats = spice_gst_encoder_get_stats;
+ encoder->base.codec_type = codec_type;

if (cbs) {
encoder->cbs = *cbs;
diff --git a/server/mjpeg-encoder.c b/server/mjpeg-encoder.c
index 5ec0753..41237a7 100644
--- a/server/mjpeg-encoder.c
+++ b/server/mjpeg-encoder.c
@@ -1344,17 +1344,21 @@ static void mjpeg_encoder_get_stats(VideoEncoder *video_encoder,
stats->avg_quality = (double)encoder->avg_quality / encoder->num_frames;
}

-VideoEncoder *mjpeg_encoder_new(uint64_t starting_bit_rate,
+VideoEncoder *mjpeg_encoder_new(SpiceVideoCodecType codec_type,
+ uint64_t starting_bit_rate,
VideoEncoderRateControlCbs *cbs)
{
MJpegEncoder *encoder = spice_new0(MJpegEncoder, 1);

+ spice_return_val_if_fail(codec_type == SPICE_VIDEO_CODEC_TYPE_MJPEG, NULL);
+
encoder->base.destroy = mjpeg_encoder_destroy;
encoder->base.encode_frame = mjpeg_encoder_encode_frame;
encoder->base.client_stream_report = mjpeg_encoder_client_stream_report;
encoder->base.notify_server_frame_drop = mjpeg_encoder_notify_server_frame_drop;
encoder->base.get_bit_rate = mjpeg_encoder_get_bit_rate;
encoder->base.get_stats = mjpeg_encoder_get_stats;
+ encoder->base.codec_type = codec_type;
encoder->first_frame = TRUE;
encoder->rate_control.byte_rate = starting_bit_rate / 8;
encoder->starting_bit_rate = starting_bit_rate;
diff --git a/server/red-qxl.c b/server/red-qxl.c
index abde0ba..55a77ef 100644
--- a/server/red-qxl.c
+++ b/server/red-qxl.c
@@ -1045,6 +1045,15 @@ void red_qxl_on_sv_change(QXLInstance *qxl, int sv)
&payload);
}

+void red_qxl_on_vc_change(QXLInstance *qxl, GArray *video_codecs)
+{
+ RedWorkerMessageSetVideoCodecs payload;
+ payload.video_codecs = g_array_ref(video_codecs);
+ dispatcher_send_message(qxl->st->dispatcher,
+ RED_WORKER_MESSAGE_SET_VIDEO_CODECS,
+ &payload);
+}
+
void red_qxl_set_mouse_mode(QXLInstance *qxl, uint32_t mode)
{
RedWorkerMessageSetMouseMode payload;
diff --git a/server/red-qxl.h b/server/red-qxl.h
index b1ebbe1..2dcfbb7 100644
--- a/server/red-qxl.h
+++ b/server/red-qxl.h
@@ -28,6 +28,7 @@ void red_qxl_init(SpiceServer *reds, QXLInstance *qxl);

void red_qxl_on_ic_change(QXLInstance *qxl, SpiceImageCompression ic);
void red_qxl_on_sv_change(QXLInstance *qxl, int sv);
+void red_qxl_on_vc_change(QXLInstance *qxl, GArray* video_codecs);
void red_qxl_set_mouse_mode(QXLInstance *qxl, uint32_t mode);
void red_qxl_attach_worker(QXLInstance *qxl);
void red_qxl_set_compression_level(QXLInstance *qxl, int level);
@@ -114,6 +115,7 @@ enum {
RED_WORKER_MESSAGE_DRIVER_UNLOAD,
RED_WORKER_MESSAGE_GL_SCANOUT,
RED_WORKER_MESSAGE_GL_DRAW_ASYNC,
+ RED_WORKER_MESSAGE_SET_VIDEO_CODECS,

RED_WORKER_MESSAGE_COUNT // LAST
};
@@ -251,6 +253,10 @@ typedef struct RedWorkerMessageSetStreamingVideo {
uint32_t streaming_video;
} RedWorkerMessageSetStreamingVideo;

+typedef struct RedWorkerMessageSetVideoCodecs {
+ GArray* video_codecs;
+} RedWorkerMessageSetVideoCodecs;
+
typedef struct RedWorkerMessageSetMouseMode {
uint32_t mode;
} RedWorkerMessageSetMouseMode;
diff --git a/server/red-worker.c b/server/red-worker.c
index 241c300..1992b20 100644
--- a/server/red-worker.c
+++ b/server/red-worker.c
@@ -1076,6 +1076,15 @@ static void handle_dev_set_streaming_video(void *opaque, void *payload)
display_channel_set_stream_video(worker->display_channel, msg->streaming_video);
}

+void handle_dev_set_video_codecs(void *opaque, void *payload)
+{
+ RedWorkerMessageSetVideoCodecs *msg = payload;
+ RedWorker *worker = opaque;
+
+ display_channel_set_video_codecs(worker->display_channel, msg->video_codecs);
+ g_array_unref(msg->video_codecs);
+}
+
static void handle_dev_set_mouse_mode(void *opaque, void *payload)
{
RedWorkerMessageSetMouseMode *msg = payload;
@@ -1349,6 +1358,11 @@ static void register_callbacks(Dispatcher *dispatcher)
sizeof(RedWorkerMessageSetStreamingVideo),
DISPATCHER_NONE);
dispatcher_register_handler(dispatcher,
+ RED_WORKER_MESSAGE_SET_VIDEO_CODECS,
+ handle_dev_set_video_codecs,
+ sizeof(RedWorkerMessageSetVideoCodecs),
+ DISPATCHER_NONE);
+ dispatcher_register_handler(dispatcher,
RED_WORKER_MESSAGE_SET_MOUSE_MODE,
handle_dev_set_mouse_mode,
sizeof(RedWorkerMessageSetMouseMode),
@@ -1541,7 +1555,9 @@ RedWorker* red_worker_new(QXLInstance *qxl,
reds_register_channel(reds, channel);

// TODO: handle seemless migration. Temp, setting migrate to FALSE
- worker->display_channel = display_channel_new(reds, worker, FALSE, reds_get_streaming_video(reds),
+ worker->display_channel = display_channel_new(reds, worker, FALSE,
+ reds_get_streaming_video(reds),
+ reds_get_video_codecs(reds),
init_info.n_surfaces);

channel = RED_CHANNEL(worker->display_channel);
diff --git a/server/reds-private.h b/server/reds-private.h
index 8842aad..7a69aed 100644
--- a/server/reds-private.h
+++ b/server/reds-private.h
@@ -170,6 +170,7 @@ struct RedsState {

gboolean ticketing_enabled;
uint32_t streaming_video;
+ GArray* video_codecs;
SpiceImageCompression image_compression;
spice_wan_compression_t jpeg_state;
spice_wan_compression_t zlib_glz_state;
diff --git a/server/reds.c b/server/reds.c
index 23df51a..a270fa6 100644
--- a/server/reds.c
+++ b/server/reds.c
@@ -71,6 +71,7 @@
#include "utils.h"

#include "reds-private.h"
+#include "video-encoder.h"

static SpiceCoreInterface *core_public = NULL;

@@ -231,6 +232,7 @@ static void reds_remove_char_device(RedsState *reds, RedCharDevice *dev);
static void reds_send_mm_time(RedsState *reds);
static void reds_on_ic_change(RedsState *reds);
static void reds_on_sv_change(RedsState *reds);
+static void reds_on_vc_change(RedsState *reds);
static void reds_on_vm_stop(RedsState *reds);
static void reds_on_vm_start(RedsState *reds);
static void reds_set_mouse_mode(RedsState *reds, uint32_t mode);
@@ -3480,6 +3482,7 @@ err:
}

static const char default_renderer[] = "sw";
+static const char default_video_codecs[] = "spice:mjpeg;gstreamer:mjpeg";

/* new interface */
SPICE_GNUC_VISIBLE SpiceServer *spice_server_new(void)
@@ -3500,6 +3503,7 @@ SPICE_GNUC_VISIBLE SpiceServer *spice_server_new(void)
memset(reds->spice_uuid, 0, sizeof(reds->spice_uuid));
reds->ticketing_enabled = TRUE; /* ticketing enabled by default */
reds->streaming_video = SPICE_STREAM_VIDEO_FILTER;
+ reds->video_codecs = g_array_new(FALSE, FALSE, sizeof(RedVideoCodec));
reds->image_compression = SPICE_IMAGE_COMPRESSION_AUTO_GLZ;
reds->jpeg_state = SPICE_WAN_COMPRESSION_AUTO;
reds->zlib_glz_state = SPICE_WAN_COMPRESSION_AUTO;
@@ -3510,37 +3514,129 @@ SPICE_GNUC_VISIBLE SpiceServer *spice_server_new(void)
return reds;
}

-typedef struct RendererInfo {
- int id;
+typedef struct {
+ uint32_t id;
const char *name;
-} RendererInfo;
+} EnumNames;

-static const RendererInfo renderers_info[] = {
+static gboolean get_name_index(const EnumNames names[], const char *name, uint32_t *index)
+{
+ if (name) {
+ int i;
+ for (i = 0; names[i].name; i++) {
+ if (strcmp(name, names[i].name) == 0) {
+ *index = i;
+ return TRUE;
+ }
+ }
+ }
+ return FALSE;
+}
+
+static const EnumNames renderer_names[] = {
{RED_RENDERER_SW, "sw"},
{RED_RENDERER_INVALID, NULL},
};

-static const RendererInfo *find_renderer(const char *name)
+static gboolean reds_add_renderer(RedsState *reds, const char *name)
+{
+ uint32_t index;
+
+ if (reds->renderers->len == RED_RENDERER_LAST ||
+ !get_name_index(renderer_names, name, &index)) {
+ return FALSE;
+ }
+ g_array_append_val(reds->renderers, renderer_names[index].id);
+ return TRUE;
+}
+
+static const EnumNames video_encoder_names[] = {
+ {0, "spice"},
+ {1, "gstreamer"},
+ {0, NULL},
+};
+
+static new_video_encoder_t video_encoder_procs[] = {
+ &mjpeg_encoder_new,
+#ifdef HAVE_GSTREAMER_1_0
+ &gstreamer_encoder_new,
+#else
+ NULL,
+#endif
+};
+
+static const EnumNames video_codec_names[] = {
+ {SPICE_VIDEO_CODEC_TYPE_MJPEG, "mjpeg"},
+ {0, NULL},
+};
+
+static int video_codec_caps[] = {
+ SPICE_DISPLAY_CAP_CODEC_MJPEG,
+};
+
+
+/* Expected string: encoder:codec;encoder:codec */
+static const char* parse_video_codecs(const char *codecs, char **encoder,
+ char **codec)
{
- const RendererInfo *inf = renderers_info;
- while (inf->name) {
- if (strcmp(name, inf->name) == 0) {
- return inf;
+ if (!codecs) {
+ return NULL;
+ }
+ while (*codecs == ';') {
+ codecs++;
+ }
+ if (!*codecs) {
+ return NULL;
+ }
+ int n;
+ *encoder = *codec = NULL;
+ if (sscanf(codecs, "%m[0-9a-zA-Z_]:%m[0-9a-zA-Z_]%n", encoder, codec, &n) != 2) {
+ while (*codecs != '\0' && *codecs != ';') {
+ codecs++;
}
- inf++;
+ return codecs;
}
- return NULL;
+ return codecs + n;
}

-static int reds_add_renderer(RedsState *reds, const char *name)
+static void reds_set_video_codecs(RedsState *reds, const char *codecs)
{
- const RendererInfo *inf;
+ char *encoder_name, *codec_name;

- if (reds->renderers->len == RED_RENDERER_LAST || !(inf = find_renderer(name))) {
- return FALSE;
+ if (strcmp(codecs, "auto") == 0) {
+ codecs = default_video_codecs;
+ }
+
+ /* The video_codecs array is immutable */
+ g_array_unref(reds->video_codecs);
+ reds->video_codecs = g_array_new(FALSE, FALSE, sizeof(RedVideoCodec));
+ const char *c = codecs;
+ while ( (c = parse_video_codecs(c, &encoder_name, &codec_name)) ) {
+ uint32_t encoder_index, codec_index;
+ if (!encoder_name || !codec_name) {
+ spice_warning("spice: invalid encoder:codec value at %s", codecs);
+
+ } else if (!get_name_index(video_encoder_names, encoder_name, &encoder_index)){
+ spice_warning("spice: unknown video encoder %s", encoder_name);
+
+ } else if (!get_name_index(video_codec_names, codec_name, &codec_index)) {
+ spice_warning("spice: unknown video codec %s", codec_name);
+
+ } else if (!video_encoder_procs[encoder_index]) {
+ spice_warning("spice: unsupported video encoder %s", encoder_name);
+
+ } else {
+ RedVideoCodec new_codec;
+ new_codec.create = video_encoder_procs[encoder_index];
+ new_codec.type = video_codec_names[codec_index].id;
+ new_codec.cap = video_codec_caps[codec_index];
+ g_array_append_val(reds->video_codecs, new_codec);
+ }
+
+ free(encoder_name);
+ free(codec_name);
+ codecs = c;
}
- g_array_append_val(reds->renderers, inf->id);
- return TRUE;
}

SPICE_GNUC_VISIBLE int spice_server_init(SpiceServer *reds, SpiceCoreInterface *core)
@@ -3551,12 +3647,16 @@ SPICE_GNUC_VISIBLE int spice_server_init(SpiceServer *reds, SpiceCoreInterface *
if (reds->renderers->len == 0) {
reds_add_renderer(reds, default_renderer);
}
+ if (reds->video_codecs->len == 0) {
+ reds_set_video_codecs(reds, default_video_codecs);
+ }
return ret;
}

SPICE_GNUC_VISIBLE void spice_server_destroy(SpiceServer *reds)
{
g_array_unref(reds->renderers);
+ g_array_unref(reds->video_codecs);
if (reds->main_channel) {
main_channel_close(reds->main_channel);
}
@@ -3857,6 +3957,18 @@ uint32_t reds_get_streaming_video(const RedsState *reds)
return reds->streaming_video;
}

+SPICE_GNUC_VISIBLE int spice_server_set_video_codecs(SpiceServer *reds, const char *video_codecs)
+{
+ reds_set_video_codecs(reds, video_codecs);
+ reds_on_vc_change(reds);
+ return 0;
+}
+
+GArray* reds_get_video_codecs(const RedsState *reds)
+{
+ return reds->video_codecs;
+}
+
SPICE_GNUC_VISIBLE int spice_server_set_playback_compression(SpiceServer *reds, int enable)
{
snd_set_playback_compression(enable);
@@ -4244,6 +4356,15 @@ void reds_on_sv_change(RedsState *reds)
}
}

+void reds_on_vc_change(RedsState *reds)
+{
+ GList *l;
+
+ for (l = reds->qxl_instances; l != NULL; l = l->next) {
+ red_qxl_on_vc_change(l->data, reds_get_video_codecs(reds));
+ }
+}
+
void reds_on_vm_stop(RedsState *reds)
{
GList *l;
diff --git a/server/reds.h b/server/reds.h
index 83618e9..cfd5723 100644
--- a/server/reds.h
+++ b/server/reds.h
@@ -106,6 +106,7 @@ void reds_on_char_device_state_destroy(RedsState *reds, RedCharDevice *dev);

void reds_set_client_mm_time_latency(RedsState *reds, RedClient *client, uint32_t latency);
uint32_t reds_get_streaming_video(const RedsState *reds);
+GArray* reds_get_video_codecs(const RedsState *reds);
spice_wan_compression_t reds_get_jpeg_state(const RedsState *reds);
spice_wan_compression_t reds_get_zlib_glz_state(const RedsState *reds);
SpiceCoreInterfaceInternal* reds_get_core_interface(RedsState *reds);
diff --git a/server/spice-server.h b/server/spice-server.h
index c2ff61d..10b50a2 100644
--- a/server/spice-server.h
+++ b/server/spice-server.h
@@ -107,6 +107,14 @@ enum {
};

int spice_server_set_streaming_video(SpiceServer *s, int value);
+
+enum {
+ SPICE_STREAMING_INVALID,
+ SPICE_STREAMING_SPICE,
+ SPICE_STREAMING_GSTREAMER
+};
+
+int spice_server_set_video_codecs(SpiceServer *s, const char* video_codecs);
int spice_server_set_playback_compression(SpiceServer *s, int enable);
int spice_server_set_agent_mouse(SpiceServer *s, int enable);
int spice_server_set_agent_copypaste(SpiceServer *s, int enable);
diff --git a/server/spice-server.syms b/server/spice-server.syms
index 5c3e53c..edf04a4 100644
--- a/server/spice-server.syms
+++ b/server/spice-server.syms
@@ -168,3 +168,8 @@ global:
spice_qxl_gl_scanout;
spice_qxl_gl_draw_async;
} SPICE_SERVER_0.12.6;
+
+SPICE_SERVER_0.13.2 {
+global:
+ spice_server_set_video_codecs;
+} SPICE_SERVER_0.13.1;
diff --git a/server/stream.c b/server/stream.c
index 64d4a90..0aa10c3 100644
--- a/server/stream.c
+++ b/server/stream.c
@@ -701,17 +701,34 @@ static VideoEncoder* dcc_create_video_encoder(DisplayChannelClient *dcc,
uint64_t starting_bit_rate,
VideoEncoderRateControlCbs *cbs)
{
+ DisplayChannel *display = DCC_TO_DC(dcc);
RedChannelClient *rcc = RED_CHANNEL_CLIENT(dcc);
int client_has_multi_codec = red_channel_client_test_remote_cap(rcc, SPICE_DISPLAY_CAP_MULTI_CODEC);
- if (!client_has_multi_codec || red_channel_client_test_remote_cap(rcc, SPICE_DISPLAY_CAP_CODEC_MJPEG)) {
-#ifdef HAVE_GSTREAMER_1_0
- VideoEncoder* video_encoder = gstreamer_encoder_new(starting_bit_rate, cbs);
+ int i;
+
+ for (i = 0; i < display->video_codecs->len; i++) {
+ RedVideoCodec* video_codec = &g_array_index (display->video_codecs, RedVideoCodec, i);
+
+ if (!client_has_multi_codec &&
+ video_codec->type != SPICE_VIDEO_CODEC_TYPE_MJPEG) {
+ /* Old clients only support MJPEG */
+ continue;
+ }
+ if (client_has_multi_codec &&
+ !red_channel_client_test_remote_cap(rcc, video_codec->cap)) {
+ /* The client is recent but does not support this codec */
+ continue;
+ }
+
+ VideoEncoder* video_encoder = video_codec->create(video_codec->type, starting_bit_rate, cbs);
if (video_encoder) {
return video_encoder;
}
-#endif
- /* Use the builtin MJPEG video encoder as a fallback */
- return mjpeg_encoder_new(starting_bit_rate, cbs);
+ }
+
+ /* Try to use the builtin MJPEG video encoder as a fallback */
+ if (!client_has_multi_codec || red_channel_client_test_remote_cap(rcc, SPICE_DISPLAY_CAP_CODEC_MJPEG)) {
+ return mjpeg_encoder_new(SPICE_VIDEO_CODEC_TYPE_MJPEG, starting_bit_rate, cbs);
}

return NULL;
diff --git a/server/video-encoder.h b/server/video-encoder.h
index 104f3b5..8f3807b 100644
--- a/server/video-encoder.h
+++ b/server/video-encoder.h
@@ -116,6 +116,9 @@ struct VideoEncoder {
* statistics.
*/
void (*get_stats)(VideoEncoder *encoder, VideoEncoderStats *stats);
+
+ /* The codec being used by the video encoder */
+ SpiceVideoCodecType codec_type;
};


@@ -148,17 +151,30 @@ typedef struct VideoEncoderRateControlCbs {

/* Instantiates the video encoder.
*
+ * @codec_type: The codec to use.
* @starting_bit_rate: An initial estimate of the available stream bit rate
* or zero if the client does not support rate control.
* @cbs: A set of callback methods to be used for rate control.
* @return: A pointer to a structure implementing the VideoEncoder
* methods.
*/
-VideoEncoder* mjpeg_encoder_new(uint64_t starting_bit_rate,
+typedef VideoEncoder* (*new_video_encoder_t)(SpiceVideoCodecType codec_type,
+ uint64_t starting_bit_rate,
+ VideoEncoderRateControlCbs *cbs);
+
+VideoEncoder* mjpeg_encoder_new(SpiceVideoCodecType codec_type,
+ uint64_t starting_bit_rate,
VideoEncoderRateControlCbs *cbs);
#ifdef HAVE_GSTREAMER_1_0
-VideoEncoder* gstreamer_encoder_new(uint64_t starting_bit_rate,
+VideoEncoder* gstreamer_encoder_new(SpiceVideoCodecType codec_type,
+ uint64_t starting_bit_rate,
VideoEncoderRateControlCbs *cbs);
#endif

+typedef struct RedVideoCodec {
+ new_video_encoder_t create;
+ SpiceVideoCodecType type;
+ uint32_t cap;
+} RedVideoCodec;
+
#endif
--
2.8.0.rc3
Pavel Grunt
2016-04-05 15:15:27 UTC
Permalink
---
server/tests/replay.c | 11 ++++++++++-
1 file changed, 10 insertions(+), 1 deletion(-)

diff --git a/server/tests/replay.c b/server/tests/replay.c
index 7e4659b..025b20c 100644
--- a/server/tests/replay.c
+++ b/server/tests/replay.c
@@ -290,7 +290,7 @@ int main(int argc, char **argv)
{
GError *error = NULL;
GOptionContext *context = NULL;
- gchar *client = NULL, **file = NULL;
+ gchar *client = NULL, *codecs = NULL, **file = NULL;
gint port = 5000, compression = SPICE_IMAGE_COMPRESSION_AUTO_GLZ;
gint streaming = SPICE_STREAM_VIDEO_FILTER;
gboolean wait = FALSE;
@@ -300,6 +300,7 @@ int main(int argc, char **argv)
{ "client", 'c', 0, G_OPTION_ARG_STRING, &client, "Client", "CMD" },
{ "compression", 'C', 0, G_OPTION_ARG_INT, &compression, "Compression (default 2)", "INT" },
{ "streaming", 'S', 0, G_OPTION_ARG_INT, &streaming, "Streaming (default 3)", "INT" },
+ { "video-codecs", 'v', 0, G_OPTION_ARG_STRING, &codecs, "Video codecs", "STRING" },
{ "port", 'p', 0, G_OPTION_ARG_INT, &port, "Server port (default 5000)", "PORT" },
{ "wait", 'w', 0, G_OPTION_ARG_NONE, &wait, "Wait for client", NULL },
{ "slow", 's', 0, G_OPTION_ARG_INT, &slow, "Slow down replay. Delays USEC microseconds before each command", "USEC" },
@@ -386,6 +387,14 @@ int main(int argc, char **argv)
server = spice_server_new();
spice_server_set_image_compression(server, compression);
spice_server_set_streaming_video(server, streaming);
+
+ if (codecs != NULL) {
+ if (spice_server_set_video_codecs(server, codecs) != 0) {
+ g_warning("could not set codecs: %s", codecs);
+ }
+ g_free(codecs);
+ }
+
spice_server_set_port(server, port);
spice_server_set_noauth(server);
--
2.8.0.rc3
Francois Gouget
2016-04-05 15:15:31 UTC
Permalink
Signed-off-by: Francois Gouget <***@codeweavers.com>
---
configure.ac | 1 +
server/gstreamer-encoder.c | 74 ++++++++++++++++++++++++++++++++++++++++------
server/reds.c | 4 ++-
3 files changed, 69 insertions(+), 10 deletions(-)

diff --git a/configure.ac b/configure.ac
index 9904bc8..9bd8ccb 100644
--- a/configure.ac
+++ b/configure.ac
@@ -79,6 +79,7 @@ if test "x$enable_gstreamer" != "xno"; then
[enable_gstreamer="yes"
SPICE_CHECK_GSTREAMER_ELEMENTS($GST_INSPECT_1_0, [gst-plugins-base 1.0], [appsrc videoconvert appsink])
SPICE_CHECK_GSTREAMER_ELEMENTS($GST_INSPECT_1_0, [gstreamer-libav 1.0], [avenc_mjpeg])
+ SPICE_CHECK_GSTREAMER_ELEMENTS($GST_INSPECT_1_0, [gst-plugins-good 1.0], [vp8enc])
],
[if test "x$enable_gstreamer" = "xyes"; then
AC_MSG_ERROR([GStreamer 1.0 support requested but not found. You may set GSTREAMER_1_0_CFLAGS and GSTREAMER_1_0_LIBS to avoid the need to call pkg-config.])
diff --git a/server/gstreamer-encoder.c b/server/gstreamer-encoder.c
index d6eb1eb..5e6e52c 100644
--- a/server/gstreamer-encoder.c
+++ b/server/gstreamer-encoder.c
@@ -191,11 +191,42 @@ static void set_appsrc_caps(SpiceGstEncoder *encoder)
/* A helper for spice_gst_encoder_encode_frame() */
static gboolean create_pipeline(SpiceGstEncoder *encoder)
{
+ gchar *gstenc;
+ switch (encoder->base.codec_type)
+ {
+ case SPICE_VIDEO_CODEC_TYPE_MJPEG:
+ /* Set max-threads to ensure zero-frame latency */
+ gstenc = g_strdup("avenc_mjpeg max-threads=1");
+ break;
+ case SPICE_VIDEO_CODEC_TYPE_VP8: {
+ /* See http://www.webmproject.org/docs/encoder-parameters/
+ * - Set end-usage to get a constant bitrate to help with streaming.
+ * - resize-allowed allows trading resolution for low bitrates while
+ * min-quantizer ensures the bitrate does not get needlessly high.
+ * - error-resilient minimises artifacts in case the client drops a
+ * frame.
+ * - Set lag-in-frames, deadline and cpu-used to match
+ * "Profile Realtime". lag-in-frames ensures zero-frame latency,
+ * deadline turns on realtime behavior, and cpu-used targets a 75%
+ * CPU usage.
+ * - deadline is supposed to be set in microseconds but in practice
+ * it behaves like a boolean.
+ */
+ gstenc = g_strdup_printf("vp8enc end-usage=cbr min-quantizer=10 resize-allowed=true error-resilient=true lag-in-frames=0 deadline=1 cpu-used=4");
+ break;
+ }
+ default:
+ /* gstreamer_encoder_new() should have rejected this codec type */
+ spice_warning("unsupported codec type %d", encoder->base.codec_type);
+ return FALSE;
+ }
+
GError *err = NULL;
- /* Set max-threads to ensure zero-frame latency */
- const gchar *desc = "appsrc is-live=true format=time do-timestamp=true name=src ! videoconvert ! avenc_mjpeg max-threads=1 name=encoder ! appsink name=sink";
+ gchar *desc = g_strdup_printf("appsrc is-live=true format=time do-timestamp=true name=src ! videoconvert ! %s name=encoder ! appsink name=sink", gstenc);
spice_debug("GStreamer pipeline: %s", desc);
encoder->pipeline = gst_parse_launch_full(desc, NULL, GST_PARSE_FLAG_FATAL_ERRORS, &err);
+ g_free(gstenc);
+ g_free(desc);
if (!encoder->pipeline || err) {
spice_warning("GStreamer error: %s", err->message);
g_clear_error(&err);
@@ -221,12 +252,27 @@ static gboolean configure_pipeline(SpiceGstEncoder *encoder,

/* Configure the encoder bitrate */
adjust_bit_rate(encoder);
- g_object_set(G_OBJECT(encoder->gstenc),
- "bitrate", (gint)encoder->bit_rate, NULL);
-
- /* See https://bugzilla.gnome.org/show_bug.cgi?id=753257 */
- spice_debug("removing the pipeline clock");
- gst_pipeline_use_clock(GST_PIPELINE(encoder->pipeline), NULL);
+ switch (encoder->base.codec_type)
+ {
+ case SPICE_VIDEO_CODEC_TYPE_MJPEG:
+ g_object_set(G_OBJECT(encoder->gstenc),
+ "bitrate", (gint)encoder->bit_rate,
+ NULL);
+ /* See https://bugzilla.gnome.org/show_bug.cgi?id=753257 */
+ spice_debug("removing the pipeline clock");
+ gst_pipeline_use_clock(GST_PIPELINE(encoder->pipeline), NULL);
+ break;
+ case SPICE_VIDEO_CODEC_TYPE_VP8:
+ g_object_set(G_OBJECT(encoder->gstenc),
+ "target-bitrate", (gint)encoder->bit_rate,
+ NULL);
+ break;
+ default:
+ /* gstreamer_encoder_new() should have rejected this codec type */
+ spice_warning("unsupported codec type %d", encoder->base.codec_type);
+ free_pipeline(encoder);
+ return FALSE;
+ }

/* Set the source caps */
set_appsrc_caps(encoder);
@@ -247,6 +293,15 @@ static void reconfigure_pipeline(SpiceGstEncoder *encoder)
if (!is_pipeline_configured(encoder)) {
return;
}
+ if (encoder->base.codec_type == SPICE_VIDEO_CODEC_TYPE_VP8) {
+ /* vp8enc fails to account for caps changes that modify the frame
+ * size and complains about the buffer size.
+ * So recreate the pipeline from scratch.
+ */
+ free_pipeline(encoder);
+ return;
+ }
+
if (gst_element_set_state(encoder->pipeline, GST_STATE_PAUSED) == GST_STATE_CHANGE_FAILURE) {
spice_debug("GStreamer error: could not pause the pipeline, rebuilding it instead");
free_pipeline(encoder);
@@ -511,7 +566,8 @@ VideoEncoder *gstreamer_encoder_new(SpiceVideoCodecType codec_type,
uint64_t starting_bit_rate,
VideoEncoderRateControlCbs *cbs)
{
- spice_return_val_if_fail(codec_type == SPICE_VIDEO_CODEC_TYPE_MJPEG, NULL);
+ spice_return_val_if_fail(codec_type == SPICE_VIDEO_CODEC_TYPE_MJPEG ||
+ codec_type == SPICE_VIDEO_CODEC_TYPE_VP8, NULL);

GError *err = NULL;
if (!gst_init_check(NULL, NULL, &err)) {
diff --git a/server/reds.c b/server/reds.c
index a270fa6..cffed46 100644
--- a/server/reds.c
+++ b/server/reds.c
@@ -3482,7 +3482,7 @@ err:
}

static const char default_renderer[] = "sw";
-static const char default_video_codecs[] = "spice:mjpeg;gstreamer:mjpeg";
+static const char default_video_codecs[] = "spice:mjpeg;gstreamer:mjpeg;gstreamer:vp8";

/* new interface */
SPICE_GNUC_VISIBLE SpiceServer *spice_server_new(void)
@@ -3567,11 +3567,13 @@ static new_video_encoder_t video_encoder_procs[] = {

static const EnumNames video_codec_names[] = {
{SPICE_VIDEO_CODEC_TYPE_MJPEG, "mjpeg"},
+ {SPICE_VIDEO_CODEC_TYPE_VP8, "vp8"},
{0, NULL},
};

static int video_codec_caps[] = {
SPICE_DISPLAY_CAP_CODEC_MJPEG,
+ SPICE_DISPLAY_CAP_CODEC_VP8,
};
--
2.8.0.rc3
Francois Gouget
2016-04-05 15:15:39 UTC
Permalink
Signed-off-by: Francois Gouget <***@codeweavers.com>
---
configure.ac | 4 ++++
server/gstreamer-encoder.c | 27 ++++++++++++++++++++++++++-
2 files changed, 30 insertions(+), 1 deletion(-)

diff --git a/configure.ac b/configure.ac
index 9bd8ccb..6093431 100644
--- a/configure.ac
+++ b/configure.ac
@@ -129,6 +129,10 @@ AC_SUBST([SPICE_PROTOCOL_MIN_VER])
PKG_CHECK_MODULES([GLIB2], [glib-2.0 >= 2.22])
AS_VAR_APPEND([SPICE_REQUIRES], [" glib-2.0 >= 2.22"])

+AC_CHECK_LIB(glib-2.0, g_get_num_processors,
+ AC_DEFINE([HAVE_G_GET_NUMPROCESSORS], 1, [Defined if we have g_get_num_processors()]),,
+ $GLIB2_LIBS)
+
PKG_CHECK_MODULES([GOBJECT2], [gobject-2.0 >= 2.22])
AS_VAR_APPEND([SPICE_REQUIRES], [" gobject-2.0 >= 2.22"])

diff --git a/server/gstreamer-encoder.c b/server/gstreamer-encoder.c
index 5e6e52c..09e3678 100644
--- a/server/gstreamer-encoder.c
+++ b/server/gstreamer-encoder.c
@@ -188,6 +188,25 @@ static void set_appsrc_caps(SpiceGstEncoder *encoder)
g_object_set(G_OBJECT(encoder->appsrc), "caps", encoder->src_caps, NULL);
}

+static int physical_core_count = 0;
+static int get_physical_core_count(void)
+{
+ if (!physical_core_count) {
+#ifdef HAVE_G_GET_NUMPROCESSORS
+ physical_core_count = g_get_num_processors();
+#endif
+ if (system("egrep -l '^flags\\b.*: .*\\bht\\b' /proc/cpuinfo >/dev/null 2>&1") == 0) {
+ /* Hyperthreading is enabled so divide by two to get the number
+ * of physical cores.
+ */
+ physical_core_count = physical_core_count / 2;
+ }
+ if (physical_core_count == 0)
+ physical_core_count = 1;
+ }
+ return physical_core_count;
+}
+
/* A helper for spice_gst_encoder_encode_frame() */
static gboolean create_pipeline(SpiceGstEncoder *encoder)
{
@@ -211,8 +230,14 @@ static gboolean create_pipeline(SpiceGstEncoder *encoder)
* CPU usage.
* - deadline is supposed to be set in microseconds but in practice
* it behaves like a boolean.
+ * - At least up to GStreamer 1.6.2, vp8enc cannot be trusted to pick
+ * the optimal number of threads. Also exceeding the number of
+ * physical cores really degrades image quality.
+ * - token-partitions parallelizes more operations.
*/
- gstenc = g_strdup_printf("vp8enc end-usage=cbr min-quantizer=10 resize-allowed=true error-resilient=true lag-in-frames=0 deadline=1 cpu-used=4");
+ int threads = get_physical_core_count();
+ int parts = threads < 2 ? 0 : threads < 4 ? 1 : threads < 8 ? 2 : 3;
+ gstenc = g_strdup_printf("vp8enc end-usage=cbr min-quantizer=10 resize-allowed=true error-resilient=true lag-in-frames=0 deadline=1 cpu-used=4 threads=%d token-partitions=%d", threads, parts);
break;
}
default:
--
2.8.0.rc3
Francois Gouget
2016-04-05 15:15:44 UTC
Permalink
This way the video encoder is not forced to use malloc()/free().
It also gives the video encoder more flexibility in how it manages
the buffer, which makes a zero-copy implementation possible in both
video encoders.
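
For reference, a minimal sketch of the new caller contract (the
marshalling details are elided and handle_compressed_data() is just a
hypothetical placeholder for whatever the caller does with the data):

    VideoBuffer *outbuf;
    int ret = encoder->encode_frame(encoder, frame_mm_time,
                                    &image->u.bitmap, width, height,
                                    src, top_down, &outbuf);
    if (ret == VIDEO_ENCODER_FRAME_ENCODE_DONE) {
        handle_compressed_data(outbuf->data, outbuf->size);
        /* Release the buffer as soon as it is no longer needed */
        outbuf->free(outbuf);
    }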

Signed-off-by: Francois Gouget <***@codeweavers.com>
---
server/dcc-send.c | 25 +++++++-------
server/dcc.c | 5 ---
server/dcc.h | 3 --
server/gstreamer-encoder.c | 59 +++++++++++++++++++++-----------
server/mjpeg-encoder.c | 85 +++++++++++++++++++++++++++++-----------------
server/video-encoder.h | 26 ++++++++++----
6 files changed, 126 insertions(+), 77 deletions(-)

diff --git a/server/dcc-send.c b/server/dcc-send.c
index 91d6352..3be443e 100644
--- a/server/dcc-send.c
+++ b/server/dcc-send.c
@@ -1649,6 +1649,12 @@ static void red_lossy_marshall_qxl_draw_text(RedChannelClient *rcc,
}
}

+static void red_release_video_encoder_buffer(uint8_t *data, void *opaque)
+{
+ VideoBuffer *buffer = (VideoBuffer*)opaque;
+ buffer->free(buffer);
+}
+
static int red_marshall_stream_data(RedChannelClient *rcc,
SpiceMarshaller *base_marshaller, Drawable *drawable)
{
@@ -1657,7 +1663,6 @@ static int red_marshall_stream_data(RedChannelClient *rcc,
Stream *stream = drawable->stream;
SpiceImage *image;
uint32_t frame_mm_time;
- int n;
int width, height;
int ret;

@@ -1689,7 +1694,6 @@ static int red_marshall_stream_data(RedChannelClient *rcc,

StreamAgent *agent = &dcc->stream_agents[get_stream_id(display, stream)];
uint64_t time_now = spice_get_monotonic_time_ns();
- size_t outbuf_size;

if (!dcc->use_video_encoder_rate_control) {
if (time_now - agent->last_send_time < (1000 * 1000 * 1000) / agent->fps) {
@@ -1701,19 +1705,17 @@ static int red_marshall_stream_data(RedChannelClient *rcc,
}
}

+ VideoBuffer *outbuf;
/* workaround for vga streams */
frame_mm_time = drawable->red_drawable->mm_time ?
drawable->red_drawable->mm_time :
reds_get_mm_time();
- outbuf_size = dcc->send_data.stream_outbuf_size;
ret = !agent->video_encoder ? VIDEO_ENCODER_FRAME_UNSUPPORTED :
agent->video_encoder->encode_frame(agent->video_encoder,
frame_mm_time,
&image->u.bitmap, width, height,
&drawable->red_drawable->u.copy.src_area,
- stream->top_down,
- &dcc->send_data.stream_outbuf,
- &outbuf_size, &n);
+ stream->top_down, &outbuf);
switch (ret) {
case VIDEO_ENCODER_FRAME_DROP:
spice_assert(dcc->use_video_encoder_rate_control);
@@ -1729,7 +1731,6 @@ static int red_marshall_stream_data(RedChannelClient *rcc,
spice_error("bad return value (%d) from VideoEncoder::encode_frame", ret);
return FALSE;
}
- dcc->send_data.stream_outbuf_size = outbuf_size;

if (!drawable->sized_stream) {
SpiceMsgDisplayStreamData stream_data;
@@ -1738,7 +1739,7 @@ static int red_marshall_stream_data(RedChannelClient *rcc,

stream_data.base.id = get_stream_id(display, stream);
stream_data.base.multi_media_time = frame_mm_time;
- stream_data.data_size = n;
+ stream_data.data_size = outbuf->size;

spice_marshall_msg_display_stream_data(base_marshaller, &stream_data);
} else {
@@ -1748,7 +1749,7 @@ static int red_marshall_stream_data(RedChannelClient *rcc,

stream_data.base.id = get_stream_id(display, stream);
stream_data.base.multi_media_time = frame_mm_time;
- stream_data.data_size = n;
+ stream_data.data_size = outbuf->size;
stream_data.width = width;
stream_data.height = height;
stream_data.dest = drawable->red_drawable->bbox;
@@ -1757,12 +1758,12 @@ static int red_marshall_stream_data(RedChannelClient *rcc,
rect_debug(&stream_data.dest);
spice_marshall_msg_display_stream_data_sized(base_marshaller, &stream_data);
}
- spice_marshaller_add_ref(base_marshaller,
- dcc->send_data.stream_outbuf, n);
+ spice_marshaller_add_ref_full(base_marshaller, outbuf->data, outbuf->size,
+ &red_release_video_encoder_buffer, outbuf);
agent->last_send_time = time_now;
#ifdef STREAM_STATS
agent->stats.num_frames_sent++;
- agent->stats.size_sent += n;
+ agent->stats.size_sent += outbuf->size;
agent->stats.end = frame_mm_time;
#endif

diff --git a/server/dcc.c b/server/dcc.c
index 99b2540..d94d960 100644
--- a/server/dcc.c
+++ b/server/dcc.c
@@ -381,10 +381,6 @@ DisplayChannelClient *dcc_new(DisplayChannel *display,
// TODO: tune quality according to bandwidth
dcc->jpeg_quality = 85;

- size_t stream_buf_size;
- stream_buf_size = 32*1024;
- dcc->send_data.stream_outbuf = spice_malloc(stream_buf_size);
- dcc->send_data.stream_outbuf_size = stream_buf_size;
dcc->send_data.free_list.res =
spice_malloc(sizeof(SpiceResourceList) +
DISPLAY_FREE_LIST_DEFAULT_SIZE * sizeof(SpiceResourceID));
@@ -492,7 +488,6 @@ void dcc_stop(DisplayChannelClient *dcc)
dcc->pixmap_cache = NULL;
dcc_release_glz(dcc);
dcc_palette_cache_reset(dcc);
- free(dcc->send_data.stream_outbuf);
free(dcc->send_data.free_list.res);
dcc_destroy_stream_agents(dcc);
dcc_encoders_free(dcc);
diff --git a/server/dcc.h b/server/dcc.h
index 436d0be..502b0eb 100644
--- a/server/dcc.h
+++ b/server/dcc.h
@@ -88,9 +88,6 @@ struct DisplayChannelClient {
uint32_t palette_cache_items;

struct {
- uint32_t stream_outbuf_size;
- uint8_t *stream_outbuf; // caution stream buffer is also used as compress bufs!!!
-
FreeList free_list;
uint64_t pixmap_cache_items[MAX_DRAWABLE_PIXMAP_CACHE_ITEMS];
int num_pixmap_cache_items;
diff --git a/server/gstreamer-encoder.c b/server/gstreamer-encoder.c
index 09e3678..fc04f78 100644
--- a/server/gstreamer-encoder.c
+++ b/server/gstreamer-encoder.c
@@ -37,6 +37,12 @@ typedef struct {
uint32_t bpp;
} SpiceFormatForGStreamer;

+typedef struct SpiceGstVideoBuffer {
+ VideoBuffer base;
+ GstBuffer *gst_buffer;
+ GstMapInfo map;
+} SpiceGstVideoBuffer;
+
typedef struct SpiceGstEncoder {
VideoEncoder base;

@@ -80,6 +86,25 @@ typedef struct SpiceGstEncoder {
} SpiceGstEncoder;


+/* ---------- The SpiceGstVideoBuffer implementation ---------- */
+
+static void spice_gst_video_buffer_free(VideoBuffer *video_buffer)
+{
+ SpiceGstVideoBuffer *buffer = (SpiceGstVideoBuffer*)video_buffer;
+ if (buffer->gst_buffer) {
+ gst_buffer_unref(buffer->gst_buffer);
+ }
+ free(buffer);
+}
+
+static SpiceGstVideoBuffer* create_gst_video_buffer(void)
+{
+ SpiceGstVideoBuffer *buffer = spice_new0(SpiceGstVideoBuffer, 1);
+ buffer->base.free = spice_gst_video_buffer_free;
+ return buffer;
+}
+
+
/* ---------- Miscellaneous SpiceGstEncoder helpers ---------- */

static inline double get_mbps(uint64_t bit_rate)
@@ -461,29 +486,22 @@ static int push_raw_frame(SpiceGstEncoder *encoder, const SpiceBitmap *bitmap,

/* A helper for spice_gst_encoder_encode_frame() */
static int pull_compressed_buffer(SpiceGstEncoder *encoder,
- uint8_t **outbuf, size_t *outbuf_size,
- int *data_size)
+ VideoBuffer **outbuf)
{
- spice_return_val_if_fail(outbuf && outbuf_size, VIDEO_ENCODER_FRAME_UNSUPPORTED);
-
GstSample *sample = gst_app_sink_pull_sample(encoder->appsink);
if (sample) {
- GstMapInfo map;
- GstBuffer *buffer = gst_sample_get_buffer(sample);
- if (buffer && gst_buffer_map(buffer, &map, GST_MAP_READ)) {
- int size = gst_buffer_get_size(buffer);
- if (!*outbuf || *outbuf_size < size) {
- free(*outbuf);
- *outbuf = spice_malloc(size);
- *outbuf_size = size;
- }
- /* TODO Try to avoid this copy by changing the GstBuffer handling */
- memcpy(*outbuf, map.data, size);
- *data_size = size;
- gst_buffer_unmap(buffer, &map);
+ SpiceGstVideoBuffer *buffer = create_gst_video_buffer();
+ buffer->gst_buffer = gst_sample_get_buffer(sample);
+ if (buffer->gst_buffer &&
+ gst_buffer_map(buffer->gst_buffer, &buffer->map, GST_MAP_READ)) {
+ buffer->base.data = buffer->map.data;
+ buffer->base.size = gst_buffer_get_size(buffer->gst_buffer);
+ *outbuf = (VideoBuffer*)buffer;
+ gst_buffer_ref(buffer->gst_buffer);
gst_sample_unref(sample);
return VIDEO_ENCODER_FRAME_ENCODE_DONE;
}
+ buffer->base.free((VideoBuffer*)buffer);
gst_sample_unref(sample);
}
spice_debug("failed to pull the compressed buffer");
@@ -505,10 +523,11 @@ static int spice_gst_encoder_encode_frame(VideoEncoder *video_encoder,
const SpiceBitmap *bitmap,
int width, int height,
const SpiceRect *src, int top_down,
- uint8_t **outbuf, size_t *outbuf_size,
- int *data_size)
+ VideoBuffer **outbuf)
{
SpiceGstEncoder *encoder = (SpiceGstEncoder*)video_encoder;
+ g_return_val_if_fail(outbuf != NULL, VIDEO_ENCODER_FRAME_UNSUPPORTED);
+ *outbuf = NULL;

if (width != encoder->width || height != encoder->height ||
encoder->spice_format != bitmap->format) {
@@ -534,7 +553,7 @@ static int spice_gst_encoder_encode_frame(VideoEncoder *video_encoder,

int rc = push_raw_frame(encoder, bitmap, src, top_down);
if (rc == VIDEO_ENCODER_FRAME_ENCODE_DONE) {
- rc = pull_compressed_buffer(encoder, outbuf, outbuf_size, data_size);
+ rc = pull_compressed_buffer(encoder, outbuf);
if (rc != VIDEO_ENCODER_FRAME_ENCODE_DONE) {
/* The input buffer will be stuck in the pipeline, preventing
* later ones from being processed. So reset the pipeline.
diff --git a/server/mjpeg-encoder.c b/server/mjpeg-encoder.c
index 41237a7..cd86f0c 100644
--- a/server/mjpeg-encoder.c
+++ b/server/mjpeg-encoder.c
@@ -70,6 +70,9 @@ static const int mjpeg_quality_samples[MJPEG_QUALITY_SAMPLE_NUM] = {20, 30, 40,
*/
#define MJPEG_WARMUP_TIME (NSEC_PER_SEC * 3)

+/* The compressed buffer initial size. */
+#define MJPEG_INITIAL_BUFFER_SIZE (32 * 1024)
+
enum {
MJPEG_QUALITY_EVAL_TYPE_SET,
MJPEG_QUALITY_EVAL_TYPE_UPGRADE,
@@ -154,6 +157,11 @@ typedef struct MJpegEncoderRateControl {
uint64_t warmup_start_time;
} MJpegEncoderRateControl;

+typedef struct MJpegVideoBuffer {
+ VideoBuffer base;
+ size_t maxsize;
+} MJpegVideoBuffer;
+
typedef struct MJpegEncoder {
VideoEncoder base;
uint8_t *row;
@@ -180,6 +188,26 @@ static uint32_t get_min_required_playback_delay(uint64_t frame_enc_size,
uint64_t byte_rate,
uint32_t latency);

+static void mjpeg_video_buffer_free(VideoBuffer *video_buffer)
+{
+ MJpegVideoBuffer *buffer = (MJpegVideoBuffer*)video_buffer;
+ free(buffer->base.data);
+ free(buffer);
+}
+
+static MJpegVideoBuffer* create_mjpeg_video_buffer(void)
+{
+ MJpegVideoBuffer *buffer = spice_new0(MJpegVideoBuffer, 1);
+ buffer->base.free = mjpeg_video_buffer_free;
+ buffer->maxsize = MJPEG_INITIAL_BUFFER_SIZE;
+ buffer->base.data = malloc(buffer->maxsize);
+ if (!buffer->base.data) {
+ free(buffer);
+ buffer = NULL;
+ }
+ return buffer;
+}
+
static inline int rate_control_is_active(MJpegEncoder* encoder)
{
return encoder->cbs.get_roundtrip_ms != NULL;
@@ -283,24 +311,22 @@ static void term_mem_destination(j_compress_ptr cinfo)

/*
* Prepare for output to a memory buffer.
- * The caller may supply an own initial buffer with appropriate size.
- * Otherwise, or when the actual data output exceeds the given size,
- * the library adapts the buffer size as necessary.
- * The standard library functions malloc/free are used for allocating
- * larger memory, so the buffer is available to the application after
- * finishing compression, and then the application is responsible for
- * freeing the requested memory.
+ * The caller must supply its own initial buffer and size.
+ * When the actual data output exceeds the given size, the library
+ * will adapt the buffer size as necessary using the malloc()/free()
+ * functions. The buffer is available to the application after the
+ * compression and the application is then responsible for freeing it.
*/
-
static void
spice_jpeg_mem_dest(j_compress_ptr cinfo,
unsigned char ** outbuffer, size_t * outsize)
{
mem_destination_mgr *dest;
-#define OUTPUT_BUF_SIZE 4096 /* choose an efficiently fwrite'able size */

- if (outbuffer == NULL || outsize == NULL) /* sanity check */
+ if (outbuffer == NULL || *outbuffer == NULL ||
+ outsize == NULL || *outsize == 0) { /* sanity check */
ERREXIT(cinfo, JERR_BUFFER_SIZE);
+ }

/* The destination object is made permanent so that multiple JPEG images
* can be written to the same buffer without re-executing jpeg_mem_dest.
@@ -315,13 +341,6 @@ spice_jpeg_mem_dest(j_compress_ptr cinfo,
dest->pub.term_destination = term_mem_destination;
dest->outbuffer = outbuffer;
dest->outsize = outsize;
- if (*outbuffer == NULL || *outsize == 0) {
- /* Allocate initial buffer */
- *outbuffer = malloc(OUTPUT_BUF_SIZE);
- if (*outbuffer == NULL)
- ERREXIT1(cinfo, JERR_OUT_OF_MEMORY, 10);
- *outsize = OUTPUT_BUF_SIZE;
- }

dest->pub.next_output_byte = dest->buffer = *outbuffer;
dest->pub.free_in_buffer = dest->bufsize = *outsize;
@@ -707,7 +726,7 @@ static void mjpeg_encoder_adjust_fps(MJpegEncoder *encoder, uint64_t now)
static int mjpeg_encoder_start_frame(MJpegEncoder *encoder,
SpiceBitmapFmt format,
int width, int height,
- uint8_t **dest, size_t *dest_len,
+ MJpegVideoBuffer *buffer,
uint32_t frame_mm_time)
{
uint32_t quality;
@@ -789,7 +808,7 @@ static int mjpeg_encoder_start_frame(MJpegEncoder *encoder,
}
}

- spice_jpeg_mem_dest(&encoder->cinfo, dest, dest_len);
+ spice_jpeg_mem_dest(&encoder->cinfo, &buffer->base.data, &buffer->maxsize);

encoder->cinfo.image_width = width;
encoder->cinfo.image_height = height;
@@ -930,26 +949,30 @@ static int mjpeg_encoder_encode_frame(VideoEncoder *video_encoder,
const SpiceBitmap *bitmap,
int width, int height,
const SpiceRect *src, int top_down,
- uint8_t **outbuf, size_t *outbuf_size,
- int *data_size)
+ VideoBuffer **outbuf)
{
MJpegEncoder *encoder = (MJpegEncoder*)video_encoder;
+ MJpegVideoBuffer *buffer = create_mjpeg_video_buffer();
+ if (!buffer) {
+ return VIDEO_ENCODER_FRAME_UNSUPPORTED;
+ }

int ret = mjpeg_encoder_start_frame(encoder, bitmap->format,
width, height,
- outbuf, outbuf_size,
- frame_mm_time);
- if (ret != VIDEO_ENCODER_FRAME_ENCODE_DONE) {
- return ret;
+ buffer, frame_mm_time);
+ if (ret == VIDEO_ENCODER_FRAME_ENCODE_DONE) {
+ if (encode_frame(encoder, src, bitmap, top_down)) {
+ buffer->base.size = mjpeg_encoder_end_frame(encoder);
+ *outbuf = (VideoBuffer*)buffer;
+ } else {
+ ret = VIDEO_ENCODER_FRAME_UNSUPPORTED;
+ }
}

- if (!encode_frame(encoder, src, bitmap, top_down)) {
- return VIDEO_ENCODER_FRAME_UNSUPPORTED;
+ if (ret != VIDEO_ENCODER_FRAME_ENCODE_DONE) {
+ buffer->base.free((VideoBuffer*)buffer);
}
-
- *data_size = mjpeg_encoder_end_frame(encoder);
-
- return VIDEO_ENCODER_FRAME_ENCODE_DONE;
+ return ret;
}


diff --git a/server/video-encoder.h b/server/video-encoder.h
index 8f3807b..f94fd69 100644
--- a/server/video-encoder.h
+++ b/server/video-encoder.h
@@ -21,6 +21,22 @@
#ifndef _H_VIDEO_ENCODER
#define _H_VIDEO_ENCODER

+/* A structure containing the data for a compressed frame. See encode_frame(). */
+typedef struct VideoBuffer VideoBuffer;
+struct VideoBuffer {
+ /* A pointer to the compressed frame data. */
+ uint8_t *data;
+
+ /* The size of the compressed frame in bytes. */
+ uint32_t size;
+
+ /* Releases the video buffer resources and deallocates it.
+ *
+ * @buffer: The video buffer.
+ */
+ void (*free)(VideoBuffer *buffer);
+};
+
enum {
VIDEO_ENCODER_FRAME_UNSUPPORTED = -1,
VIDEO_ENCODER_FRAME_DROP,
@@ -45,11 +61,9 @@ struct VideoEncoder {
* @bitmap: The Spice screen.
* @src: A rectangle specifying the area occupied by the video.
* @top_down: If true the first video line is specified by src.top.
- * @outbuf: The buffer for the compressed frame. This must either
- * be NULL or point to a buffer allocated by malloc
- * since it may be reallocated, if its size is too small.
- * @outbuf_size: The size of the outbuf buffer.
- * @data_size: The size of the compressed frame.
+ * @outbuf: A pointer to a VideoBuffer structure containing the
+ * compressed frame if successful. Call the buffer's
+ * free() method as soon as it is no longer needed.
* @return:
* VIDEO_ENCODER_FRAME_ENCODE_DONE if successful.
* VIDEO_ENCODER_FRAME_UNSUPPORTED if the frame cannot be encoded.
@@ -59,7 +73,7 @@ struct VideoEncoder {
int (*encode_frame)(VideoEncoder *encoder, uint32_t frame_mm_time,
const SpiceBitmap *bitmap, int width, int height,
const SpiceRect *src, int top_down,
- uint8_t **outbuf, size_t *outbuf_size, int *data_size);
+ VideoBuffer** outbuf);

/*
* Bit rate control methods.
--
2.8.0.rc3
Francois Gouget
2016-04-05 15:15:48 UTC
Permalink
Note that we can only avoid copies for the first 1 Mpixel or so.
That's because Spice splits larger frames into more chunks than we
can fit GstMemory fragments into a GStreamer buffer. So for larger
frames we avoid copies for the first 3840 KB and copy the rest.
Furthermore, while in practice the GStreamer encoder only modifies
the RedDrawable refcount during encode_frame(), in theory the
refcount could be decremented from the GStreamer thread after
encode_frame() returns.
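
As a rough back-of-the-envelope, with GStreamer's default limit of 16
GstMemory objects per GstBuffer (gst_buffer_get_max_memory()) and
assuming 256 KB bitmap chunks (a figure inferred from the 3840 KB
above, not taken from the code):

    zero-copy chunks = 16 - 1 = 15  (one slot is kept for the copied remainder)
    15 * 256 KB      = 3840 KB, i.e. just under 1 Mpixel at 32 bpp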

Signed-off-by: Francois Gouget <***@codeweavers.com>
---
server/dcc-send.c | 13 +--
server/gstreamer-encoder.c | 199 +++++++++++++++++++++++++++++++++++++++++----
server/mjpeg-encoder.c | 5 +-
server/stream.c | 16 +++-
server/video-encoder.h | 25 +++++-
5 files changed, 231 insertions(+), 27 deletions(-)

diff --git a/server/dcc-send.c b/server/dcc-send.c
index 3be443e..322f6eb 100644
--- a/server/dcc-send.c
+++ b/server/dcc-send.c
@@ -1656,12 +1656,12 @@ static void red_release_video_encoder_buffer(uint8_t *data, void *opaque)
}

static int red_marshall_stream_data(RedChannelClient *rcc,
- SpiceMarshaller *base_marshaller, Drawable *drawable)
+ SpiceMarshaller *base_marshaller,
+ Drawable *drawable)
{
DisplayChannelClient *dcc = RCC_TO_DCC(rcc);
DisplayChannel *display = DCC_TO_DC(dcc);
Stream *stream = drawable->stream;
- SpiceImage *image;
uint32_t frame_mm_time;
int width, height;
int ret;
@@ -1670,10 +1670,10 @@ static int red_marshall_stream_data(RedChannelClient *rcc,
spice_assert(drawable->sized_stream);
stream = drawable->sized_stream;
}
- spice_assert(drawable->red_drawable->type == QXL_DRAW_COPY);
-
- image = drawable->red_drawable->u.copy.src_bitmap;
+ RedDrawable *red_drawable = drawable->red_drawable;
+ spice_assert(red_drawable->type == QXL_DRAW_COPY);

+ SpiceImage *image = red_drawable->u.copy.src_bitmap;
if (image->descriptor.type != SPICE_IMAGE_TYPE_BITMAP) {
return FALSE;
}
@@ -1715,7 +1715,8 @@ static int red_marshall_stream_data(RedChannelClient *rcc,
frame_mm_time,
&image->u.bitmap, width, height,
&drawable->red_drawable->u.copy.src_area,
- stream->top_down, &outbuf);
+ stream->top_down, red_drawable,
+ &outbuf);
switch (ret) {
case VIDEO_ENCODER_FRAME_DROP:
spice_assert(dcc->use_video_encoder_rate_control);
diff --git a/server/gstreamer-encoder.c b/server/gstreamer-encoder.c
index fc04f78..b754676 100644
--- a/server/gstreamer-encoder.c
+++ b/server/gstreamer-encoder.c
@@ -30,6 +30,8 @@

#define SPICE_GST_DEFAULT_FPS 30

+#define DO_ZERO_COPY
+

typedef struct {
SpiceBitmapFmt spice_format;
@@ -46,6 +48,14 @@ typedef struct SpiceGstVideoBuffer {
typedef struct SpiceGstEncoder {
VideoEncoder base;

+ /* Callbacks to adjust the refcount of the bitmap being encoded. */
+ bitmap_ref_t bitmap_ref;
+ bitmap_unref_t bitmap_unref;
+
+#ifdef DO_ZERO_COPY
+ GAsyncQueue *unused_bitmap_opaques;
+#endif
+
/* Rate control callbacks */
VideoEncoderRateControlCbs cbs;

@@ -404,12 +414,109 @@ static inline int line_copy(SpiceGstEncoder *encoder, const SpiceBitmap *bitmap,
return TRUE;
}

+#ifdef DO_ZERO_COPY
+typedef struct {
+ gint refs;
+ SpiceGstEncoder *encoder;
+ gpointer opaque;
+} BitmapWrapper;
+
+static void clear_zero_copy_queue(SpiceGstEncoder *encoder, gboolean unref_queue)
+{
+ gpointer bitmap_opaque;
+ while ((bitmap_opaque = g_async_queue_try_pop(encoder->unused_bitmap_opaques))) {
+ encoder->bitmap_unref(bitmap_opaque);
+ }
+ if (unref_queue) {
+ g_async_queue_unref(encoder->unused_bitmap_opaques);
+ }
+}
+
+static BitmapWrapper *bitmap_wrapper_new(SpiceGstEncoder *encoder, gpointer bitmap_opaque)
+{
+ BitmapWrapper *wrapper = spice_new(BitmapWrapper, 1);
+ wrapper->refs = 1;
+ wrapper->encoder = encoder;
+ wrapper->opaque = bitmap_opaque;
+ encoder->bitmap_ref(bitmap_opaque);
+ return wrapper;
+}
+
+static void bitmap_wrapper_unref(gpointer data)
+{
+ BitmapWrapper *wrapper = data;
+ if (g_atomic_int_dec_and_test(&wrapper->refs)) {
+ g_async_queue_push(wrapper->encoder->unused_bitmap_opaques, wrapper->opaque);
+ free(wrapper);
+ }
+}
+
+
+/* A helper for push_raw_frame() */
+static inline int zero_copy(SpiceGstEncoder *encoder,
+ const SpiceBitmap *bitmap, gpointer bitmap_opaque,
+ GstBuffer *buffer, uint32_t *chunk_index,
+ uint32_t *chunk_offset, uint32_t *len)
+{
+ const SpiceChunks *chunks = bitmap->data;
+ while (*chunk_index < chunks->num_chunks &&
+ *chunk_offset >= chunks->chunk[*chunk_index].len) {
+ if (is_chunk_padded(bitmap, *chunk_index)) {
+ return FALSE;
+ }
+ *chunk_offset -= chunks->chunk[*chunk_index].len;
+ (*chunk_index)++;
+ }
+
+ int max_mem = gst_buffer_get_max_memory();
+ if (chunks->num_chunks - *chunk_index > max_mem) {
+ /* There are more chunks than we can fit memory objects in a
+ * buffer. This will cause the buffer to merge memory objects to
+ * fit the extra chunks, which means doing wasteful memory copies.
+ * So use the zero-copy approach for the first max_mem-1 chunks, and
+ * let push_raw_frame() deal with the rest.
+ */
+ max_mem = *chunk_index + max_mem - 1;
+ } else {
+ max_mem = chunks->num_chunks;
+ }
+
+ BitmapWrapper *wrapper = NULL;
+ while (*len && *chunk_index < max_mem) {
+ if (is_chunk_padded(bitmap, *chunk_index)) {
+ return FALSE;
+ }
+ if (wrapper) {
+ wrapper->refs++;
+ } else {
+ wrapper = bitmap_wrapper_new(encoder, bitmap_opaque);
+ }
+ uint32_t thislen = MIN(chunks->chunk[*chunk_index].len - *chunk_offset, *len);
+ GstMemory *mem = gst_memory_new_wrapped(GST_MEMORY_FLAG_READONLY,
+ chunks->chunk[*chunk_index].data,
+ chunks->chunk[*chunk_index].len,
+ *chunk_offset, thislen,
+ wrapper, bitmap_wrapper_unref);
+ gst_buffer_append_memory(buffer, mem);
+ *len -= thislen;
+ *chunk_offset = 0;
+ (*chunk_index)++;
+ }
+ return TRUE;
+}
+#else
+static void clear_zero_copy_queue(SpiceGstEncoder *encoder, gboolean unref_queue)
+{
+ /* Nothing to do */
+}
+#endif
+
/* A helper for push_raw_frame() */
static inline int chunk_copy(SpiceGstEncoder *encoder, const SpiceBitmap *bitmap,
- uint32_t chunk_offset, uint32_t len, uint8_t *dst)
+ uint32_t chunk_index, uint32_t chunk_offset,
+ uint32_t len, uint8_t *dst)
{
SpiceChunks *chunks = bitmap->data;
- uint32_t chunk_index = 0;
/* Skip chunks until we find the start of the frame */
while (chunk_index < chunks->num_chunks &&
chunk_offset >= chunks->chunk[chunk_index].len) {
@@ -437,17 +544,35 @@ static inline int chunk_copy(SpiceGstEncoder *encoder, const SpiceBitmap *bitmap
return TRUE;
}

+/* A helper for push_raw_frame() */
+static uint8_t *allocate_and_map_memory(gsize size, GstMapInfo *map, GstBuffer *buffer)
+{
+ GstMemory *mem = gst_allocator_alloc(NULL, size, NULL);
+ if (!mem) {
+ gst_buffer_unref(buffer);
+ return NULL;
+ }
+ if (!gst_memory_map(mem, map, GST_MAP_WRITE))
+ {
+ gst_memory_unref(mem);
+ gst_buffer_unref(buffer);
+ return NULL;
+ }
+ return map->data;
+}
+
/* A helper for spice_gst_encoder_encode_frame() */
-static int push_raw_frame(SpiceGstEncoder *encoder, const SpiceBitmap *bitmap,
- const SpiceRect *src, int top_down)
+static int push_raw_frame(SpiceGstEncoder *encoder,
+ const SpiceBitmap *bitmap,
+ const SpiceRect *src, int top_down,
+ gpointer bitmap_opaque)
{
uint32_t height = src->bottom - src->top;
uint32_t stream_stride = (src->right - src->left) * encoder->format->bpp / 8;
uint32_t len = stream_stride * height;
- GstBuffer *buffer = gst_buffer_new_and_alloc(len);
- GstMapInfo map;
- gst_buffer_map(buffer, &map, GST_MAP_WRITE);
- uint8_t *dst = map.data;
+ GstBuffer *buffer = gst_buffer_new();
+ /* TODO Use GST_MAP_INFO_INIT once GStreamer 1.4.5 is no longer relevant */
+ GstMapInfo map = { .memory = NULL };

/* Note that we should not reorder the lines, even if top_down is false.
* It just changes the number of lines to skip at the start of the bitmap.
@@ -459,20 +584,50 @@ static int push_raw_frame(SpiceGstEncoder *encoder, const SpiceBitmap *bitmap,
/* We have to do a line-by-line copy because for each we have to
* leave out pixels on the left or right.
*/
+ uint8_t *dst = allocate_and_map_memory(len, &map, buffer);
+ if (!dst) {
+ return VIDEO_ENCODER_FRAME_UNSUPPORTED;
+ }
+
chunk_offset += src->left * encoder->format->bpp / 8;
if (!line_copy(encoder, bitmap, chunk_offset, stream_stride, height, dst)) {
- gst_buffer_unmap(buffer, &map);
+ gst_memory_unmap(map.memory, &map);
+ gst_memory_unref(map.memory);
gst_buffer_unref(buffer);
return VIDEO_ENCODER_FRAME_UNSUPPORTED;
}
} else {
- if (!chunk_copy(encoder, bitmap, chunk_offset, len, dst)) {
- gst_buffer_unmap(buffer, &map);
+ uint32_t chunk_index = 0;
+ /* We can copy the bitmap chunk by chunk */
+#ifdef DO_ZERO_COPY
+ if (!zero_copy(encoder, bitmap, bitmap_opaque, buffer, &chunk_index, &chunk_offset, &len)) {
gst_buffer_unref(buffer);
return VIDEO_ENCODER_FRAME_UNSUPPORTED;
}
+ /* len now contains the remaining number of bytes to copy.
+ * However we must avoid any write to the GstBuffer object as it
+ * would cause a copy of the read-only memory objects we just added.
+ * Fortunately we can append extra writable memory objects instead.
+ */
+#endif
+
+ if (len) {
+ uint8_t *dst = allocate_and_map_memory(len, &map, buffer);
+ if (!dst) {
+ return VIDEO_ENCODER_FRAME_UNSUPPORTED;
+ }
+ if (!chunk_copy(encoder, bitmap, chunk_index, chunk_offset, len, dst)) {
+ gst_memory_unmap(map.memory, &map);
+ gst_memory_unref(map.memory);
+ gst_buffer_unref(buffer);
+ return VIDEO_ENCODER_FRAME_UNSUPPORTED;
+ }
+ }
+ }
+ if (map.memory) {
+ gst_memory_unmap(map.memory, &map);
+ gst_buffer_append_memory(buffer, map.memory);
}
- gst_buffer_unmap(buffer, &map);
GST_BUFFER_OFFSET(buffer) = encoder->frame++;

GstFlowReturn ret = gst_app_src_push_buffer(encoder->appsrc, buffer);
@@ -515,6 +670,8 @@ static void spice_gst_encoder_destroy(VideoEncoder *video_encoder)
{
SpiceGstEncoder *encoder = (SpiceGstEncoder*)video_encoder;
free_pipeline(encoder);
+ /* Unref any lingering bitmap opaque structures from past frames */
+ clear_zero_copy_queue(encoder, TRUE);
free(encoder);
}

@@ -523,12 +680,16 @@ static int spice_gst_encoder_encode_frame(VideoEncoder *video_encoder,
const SpiceBitmap *bitmap,
int width, int height,
const SpiceRect *src, int top_down,
+ gpointer bitmap_opaque,
VideoBuffer **outbuf)
{
SpiceGstEncoder *encoder = (SpiceGstEncoder*)video_encoder;
g_return_val_if_fail(outbuf != NULL, VIDEO_ENCODER_FRAME_UNSUPPORTED);
*outbuf = NULL;

+ /* Unref the last frame's bitmap_opaque structures if any */
+ clear_zero_copy_queue(encoder, FALSE);
+
if (width != encoder->width || height != encoder->height ||
encoder->spice_format != bitmap->format) {
spice_debug("video format change: width %d -> %d, height %d -> %d, format %d -> %d",
@@ -551,7 +712,7 @@ static int spice_gst_encoder_encode_frame(VideoEncoder *video_encoder,
return VIDEO_ENCODER_FRAME_UNSUPPORTED;
}

- int rc = push_raw_frame(encoder, bitmap, src, top_down);
+ int rc = push_raw_frame(encoder, bitmap, src, top_down, bitmap_opaque);
if (rc == VIDEO_ENCODER_FRAME_ENCODE_DONE) {
rc = pull_compressed_buffer(encoder, outbuf);
if (rc != VIDEO_ENCODER_FRAME_ENCODE_DONE) {
@@ -561,6 +722,9 @@ static int spice_gst_encoder_encode_frame(VideoEncoder *video_encoder,
free_pipeline(encoder);
}
}
+
+ /* Unref the last frame's bitmap_opaque structures if any */
+ clear_zero_copy_queue(encoder, FALSE);
return rc;
}

@@ -608,7 +772,9 @@ static void spice_gst_encoder_get_stats(VideoEncoder *video_encoder,

VideoEncoder *gstreamer_encoder_new(SpiceVideoCodecType codec_type,
uint64_t starting_bit_rate,
- VideoEncoderRateControlCbs *cbs)
+ VideoEncoderRateControlCbs *cbs,
+ bitmap_ref_t bitmap_ref,
+ bitmap_unref_t bitmap_unref)
{
spice_return_val_if_fail(codec_type == SPICE_VIDEO_CODEC_TYPE_MJPEG ||
codec_type == SPICE_VIDEO_CODEC_TYPE_VP8, NULL);
@@ -628,11 +794,16 @@ VideoEncoder *gstreamer_encoder_new(SpiceVideoCodecType codec_type,
encoder->base.get_bit_rate = spice_gst_encoder_get_bit_rate;
encoder->base.get_stats = spice_gst_encoder_get_stats;
encoder->base.codec_type = codec_type;
+#ifdef DO_ZERO_COPY
+ encoder->unused_bitmap_opaques = g_async_queue_new();
+#endif

if (cbs) {
encoder->cbs = *cbs;
}
encoder->starting_bit_rate = starting_bit_rate;
+ encoder->bitmap_ref = bitmap_ref;
+ encoder->bitmap_unref = bitmap_unref;

/* All the other fields are initialized to zero by spice_new0(). */

diff --git a/server/mjpeg-encoder.c b/server/mjpeg-encoder.c
index cd86f0c..fc0fc0c 100644
--- a/server/mjpeg-encoder.c
+++ b/server/mjpeg-encoder.c
@@ -949,6 +949,7 @@ static int mjpeg_encoder_encode_frame(VideoEncoder *video_encoder,
const SpiceBitmap *bitmap,
int width, int height,
const SpiceRect *src, int top_down,
+ gpointer bitmap_opaque,
VideoBuffer **outbuf)
{
MJpegEncoder *encoder = (MJpegEncoder*)video_encoder;
@@ -1369,7 +1370,9 @@ static void mjpeg_encoder_get_stats(VideoEncoder *video_encoder,

VideoEncoder *mjpeg_encoder_new(SpiceVideoCodecType codec_type,
uint64_t starting_bit_rate,
- VideoEncoderRateControlCbs *cbs)
+ VideoEncoderRateControlCbs *cbs,
+ bitmap_ref_t bitmap_ref,
+ bitmap_unref_t bitmap_unref)
{
MJpegEncoder *encoder = spice_new0(MJpegEncoder, 1);

diff --git a/server/stream.c b/server/stream.c
index 0aa10c3..b134625 100644
--- a/server/stream.c
+++ b/server/stream.c
@@ -696,6 +696,18 @@ static void update_client_playback_delay(void *opaque, uint32_t delay_ms)
agent->dcc->streams_max_latency);
}

+void bitmap_ref(gpointer data)
+{
+ RedDrawable *red_drawable = (RedDrawable*)data;
+ red_drawable_ref(red_drawable);
+}
+
+void bitmap_unref(gpointer data)
+{
+ RedDrawable *red_drawable = (RedDrawable*)data;
+ red_drawable_unref(red_drawable);
+}
+
/* A helper for dcc_create_stream(). */
static VideoEncoder* dcc_create_video_encoder(DisplayChannelClient *dcc,
uint64_t starting_bit_rate,
@@ -720,7 +732,7 @@ static VideoEncoder* dcc_create_video_encoder(DisplayChannelClient *dcc,
continue;
}

- VideoEncoder* video_encoder = video_codec->create(video_codec->type, starting_bit_rate, cbs);
+ VideoEncoder* video_encoder = video_codec->create(video_codec->type, starting_bit_rate, cbs, bitmap_ref, bitmap_unref);
if (video_encoder) {
return video_encoder;
}
@@ -728,7 +740,7 @@ static VideoEncoder* dcc_create_video_encoder(DisplayChannelClient *dcc,

/* Try to use the builtin MJPEG video encoder as a fallback */
if (!client_has_multi_codec || red_channel_client_test_remote_cap(rcc, SPICE_DISPLAY_CAP_CODEC_MJPEG)) {
- return mjpeg_encoder_new(SPICE_VIDEO_CODEC_TYPE_MJPEG, starting_bit_rate, cbs);
+ return mjpeg_encoder_new(SPICE_VIDEO_CODEC_TYPE_MJPEG, starting_bit_rate, cbs, bitmap_ref, bitmap_unref);
}

return NULL;
diff --git a/server/video-encoder.h b/server/video-encoder.h
index f94fd69..41c7f17 100644
--- a/server/video-encoder.h
+++ b/server/video-encoder.h
@@ -61,6 +61,8 @@ struct VideoEncoder {
* @bitmap: The Spice screen.
* @src: A rectangle specifying the area occupied by the video.
* @top_down: If true the first video line is specified by src.top.
+ * @bitmap_opaque: The parameter for the bitmap_ref() and bitmap_unref()
+ * callbacks.
* @outbuf: A pointer to a VideoBuffer structure containing the
* compressed frame if successful. Call the buffer's
* free() method as soon as it is no longer needed.
@@ -73,7 +75,7 @@ struct VideoEncoder {
int (*encode_frame)(VideoEncoder *encoder, uint32_t frame_mm_time,
const SpiceBitmap *bitmap, int width, int height,
const SpiceRect *src, int top_down,
- VideoBuffer** outbuf);
+ gpointer bitmap_opaque, VideoBuffer** outbuf);

/*
* Bit rate control methods.
@@ -162,6 +164,8 @@ typedef struct VideoEncoderRateControlCbs {
void (*update_client_playback_delay)(void *opaque, uint32_t delay_ms);
} VideoEncoderRateControlCbs;

+typedef void (*bitmap_ref_t)(gpointer data);
+typedef void (*bitmap_unref_t)(gpointer data);

/* Instantiates the video encoder.
*
@@ -169,22 +173,35 @@ typedef struct VideoEncoderRateControlCbs {
* @starting_bit_rate: An initial estimate of the available stream bit rate
* or zero if the client does not support rate control.
* @cbs: A set of callback methods to be used for rate control.
+ * @bitmap_ref: A callback that the encoder can use to increase the
+ * bitmap refcount.
+ * This must be called from the main context.
+ * @bitmap_unref: A callback that the encoder can use to decrease the
+ * bitmap refcount.
+ * This must be called from the main context.
* @return: A pointer to a structure implementing the VideoEncoder
* methods.
*/
typedef VideoEncoder* (*new_video_encoder_t)(SpiceVideoCodecType codec_type,
uint64_t starting_bit_rate,
- VideoEncoderRateControlCbs *cbs);
+ VideoEncoderRateControlCbs *cbs,
+ bitmap_ref_t bitmap_ref,
+ bitmap_unref_t bitmap_unref);

VideoEncoder* mjpeg_encoder_new(SpiceVideoCodecType codec_type,
uint64_t starting_bit_rate,
- VideoEncoderRateControlCbs *cbs);
+ VideoEncoderRateControlCbs *cbs,
+ bitmap_ref_t bitmap_ref,
+ bitmap_unref_t bitmap_unref);
#ifdef HAVE_GSTREAMER_1_0
VideoEncoder* gstreamer_encoder_new(SpiceVideoCodecType codec_type,
uint64_t starting_bit_rate,
- VideoEncoderRateControlCbs *cbs);
+ VideoEncoderRateControlCbs *cbs,
+ bitmap_ref_t bitmap_ref,
+ bitmap_unref_t bitmap_unref);
#endif

+
typedef struct RedVideoCodec {
new_video_encoder_t create;
SpiceVideoCodecType type;
--
2.8.0.rc3
Francois Gouget
2016-04-05 15:15:53 UTC
Permalink
Signed-off-by: Francois Gouget <***@codeweavers.com>
---
configure.ac | 1 +
server/gstreamer-encoder.c | 17 ++++++++++++++++-
server/reds.c | 4 +++-
3 files changed, 20 insertions(+), 2 deletions(-)

diff --git a/configure.ac b/configure.ac
index 6093431..1e98523 100644
--- a/configure.ac
+++ b/configure.ac
@@ -80,6 +80,7 @@ if test "x$enable_gstreamer" != "xno"; then
SPICE_CHECK_GSTREAMER_ELEMENTS($GST_INSPECT_1_0, [gst-plugins-base 1.0], [appsrc videoconvert appsink])
SPICE_CHECK_GSTREAMER_ELEMENTS($GST_INSPECT_1_0, [gstreamer-libav 1.0], [avenc_mjpeg])
SPICE_CHECK_GSTREAMER_ELEMENTS($GST_INSPECT_1_0, [gst-plugins-good 1.0], [vp8enc])
+ SPICE_CHECK_GSTREAMER_ELEMENTS($GST_INSPECT_1_0, [gst-plugins-ugly 1.0], [x264enc])
],
[if test "x$enable_gstreamer" = "xyes"; then
AC_MSG_ERROR([GStreamer 1.0 support requested but not found. You may set GSTREAMER_1_0_CFLAGS and GSTREAMER_1_0_LIBS to avoid the need to call pkg-config.])
diff --git a/server/gstreamer-encoder.c b/server/gstreamer-encoder.c
index b754676..82a29aa 100644
--- a/server/gstreamer-encoder.c
+++ b/server/gstreamer-encoder.c
@@ -275,6 +275,15 @@ static gboolean create_pipeline(SpiceGstEncoder *encoder)
gstenc = g_strdup_printf("vp8enc end-usage=cbr min-quantizer=10 resize-allowed=true error-resilient=true lag-in-frames=0 deadline=1 cpu-used=4 threads=%d token-partitions=%d", threads, parts);
break;
}
+ case SPICE_VIDEO_CODEC_TYPE_H264:
+ /* - Set tune and sliced-threads to ensure zero-frame latency.
+ * - qp-min ensures the bitrate does not get needlessly high.
+ * - Set speed-preset to get realtime speed.
+ * - Set intra-refresh to get more uniform compressed frame sizes,
+ * thus helping with streaming.
+ */
+ gstenc = g_strdup("x264enc byte-stream=true aud=true qp-min=15 tune=4 sliced-threads=true speed-preset=ultrafast intra-refresh=true");
+ break;
default:
/* gstreamer_encoder_new() should have rejected this codec type */
spice_warning("unsupported codec type %d", encoder->base.codec_type);
@@ -327,6 +336,11 @@ static gboolean configure_pipeline(SpiceGstEncoder *encoder,
"target-bitrate", (gint)encoder->bit_rate,
NULL);
break;
+ case SPICE_VIDEO_CODEC_TYPE_H264:
+ g_object_set(G_OBJECT(encoder->gstenc),
+ "bitrate", encoder->bit_rate / 1024,
+ NULL);
+ break;
default:
/* gstreamer_encoder_new() should have rejected this codec type */
spice_warning("unsupported codec type %d", encoder->base.codec_type);
@@ -777,7 +791,8 @@ VideoEncoder *gstreamer_encoder_new(SpiceVideoCodecType codec_type,
bitmap_unref_t bitmap_unref)
{
spice_return_val_if_fail(codec_type == SPICE_VIDEO_CODEC_TYPE_MJPEG ||
- codec_type == SPICE_VIDEO_CODEC_TYPE_VP8, NULL);
+ codec_type == SPICE_VIDEO_CODEC_TYPE_VP8 ||
+ codec_type == SPICE_VIDEO_CODEC_TYPE_H264, NULL);

GError *err = NULL;
if (!gst_init_check(NULL, NULL, &err)) {
diff --git a/server/reds.c b/server/reds.c
index cffed46..03eb678 100644
--- a/server/reds.c
+++ b/server/reds.c
@@ -3482,7 +3482,7 @@ err:
}

static const char default_renderer[] = "sw";
-static const char default_video_codecs[] = "spice:mjpeg;gstreamer:mjpeg;gstreamer:vp8";
+static const char default_video_codecs[] = "spice:mjpeg;gstreamer:mjpeg;gstreamer:h264;gstreamer:vp8";

/* new interface */
SPICE_GNUC_VISIBLE SpiceServer *spice_server_new(void)
@@ -3568,12 +3568,14 @@ static new_video_encoder_t video_encoder_procs[] = {
static const EnumNames video_codec_names[] = {
{SPICE_VIDEO_CODEC_TYPE_MJPEG, "mjpeg"},
{SPICE_VIDEO_CODEC_TYPE_VP8, "vp8"},
+ {SPICE_VIDEO_CODEC_TYPE_H264, "h264"},
{0, NULL},
};

static int video_codec_caps[] = {
SPICE_DISPLAY_CAP_CODEC_MJPEG,
SPICE_DISPLAY_CAP_CODEC_VP8,
+ SPICE_DISPLAY_CAP_CODEC_H264,
};
--
2.8.0.rc3
Francois Gouget
2016-04-05 15:16:01 UTC
Permalink
The GStreamer codecs don't follow the specified bit rate very closely:
they can decide to exceed it for ten seconds or more if they judge that
the scene deserves it. Such long bursts are enough to cause network
congestion, resulting in many lost frames and thus in significant
display corruption.
So the GStreamer video encoder now uses a short 300ms virtual buffer
to shape the compressed video output and ensure the target bit rate is
not exceeded for any significant length of time.
It could instead rely on the network feedback (when available) to lower
the bit rate. However, frequent GStreamer bit rate changes lower the
overall compression level and also result in a lower average bit rate,
both of which degrade the video quality.
The GStreamer video encoder also keeps track of the encoded frame sizes
so it can gather statistics, call update_client_playback_delay() with
accurate information, and annotate the client report debug traces with
the corresponding bit rate information.
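
For illustration only, here is a minimal sketch of the virtual buffer
(leaky bucket) accounting described above. The names loosely mirror the
fields introduced by the patch below but this is just a sketch, not the
actual implementation:

#include <stdint.h>

typedef struct {
    uint64_t bit_rate;      /* target network bit rate in bits per second */
    int32_t  vbuffer_size;  /* capacity: ~300ms worth of data, in bytes */
    int32_t  vbuffer_free;  /* remaining credit; negative means over budget */
} VirtualBuffer;

/* Credit the buffer for the time elapsed since the previous frame, then
 * debit the bytes of the frame that was just encoded.
 */
static void vbuffer_account(VirtualBuffer *vb, uint32_t elapsed_ms,
                            uint32_t frame_bytes)
{
    int32_t refill = vb->bit_rate * elapsed_ms / 1000 / 8;
    vb->vbuffer_free += refill;
    if (vb->vbuffer_free > vb->vbuffer_size) {
        vb->vbuffer_free = vb->vbuffer_size;
    }
    vb->vbuffer_free -= frame_bytes;
}

/* Frames only get dropped while the virtual buffer is overdrawn. */
static int vbuffer_must_drop(const VirtualBuffer *vb)
{
    return vb->vbuffer_free < 0;
}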

Signed-off-by: Francois Gouget <***@codeweavers.com>
---
server/gstreamer-encoder.c | 291 +++++++++++++++++++++++++++++++++++++++++++--
1 file changed, 283 insertions(+), 8 deletions(-)

diff --git a/server/gstreamer-encoder.c b/server/gstreamer-encoder.c
index 82a29aa..e6ff3fa 100644
--- a/server/gstreamer-encoder.c
+++ b/server/gstreamer-encoder.c
@@ -26,6 +26,7 @@

#include "red-common.h"
#include "video-encoder.h"
+#include "utils.h"


#define SPICE_GST_DEFAULT_FPS 30
@@ -45,6 +46,11 @@ typedef struct SpiceGstVideoBuffer {
GstMapInfo map;
} SpiceGstVideoBuffer;

+typedef struct {
+ uint32_t mm_time;
+ uint32_t size;
+} SpiceGstFrameInformation;
+
typedef struct SpiceGstEncoder {
VideoEncoder base;

@@ -85,6 +91,45 @@ typedef struct SpiceGstEncoder {
/* The frame counter for GStreamer buffers */
uint32_t frame;

+
+ /* ---------- Encoded frame statistics ---------- */
+
+ /* Should be >= SPICE_GST_FRAME_STATISTICS_COUNT. This is also used to annotate
+ * the client report debug traces with bit rate information.
+ */
+# define SPICE_GST_HISTORY_SIZE 60
+
+ /* A circular buffer containing the past encoded frames information. */
+ SpiceGstFrameInformation history[SPICE_GST_HISTORY_SIZE];
+
+ /* The indices of the oldest and newest frames in the history buffer. */
+ uint32_t history_first;
+ uint32_t history_last;
+
+ /* How many frames to take into account when computing the effective
+ * bit rate, average frame size, etc. This should be large enough so the
+ * I and P frames average out, and short enough for it to reflect the
+ * current situation.
+ */
+# define SPICE_GST_FRAME_STATISTICS_COUNT 21
+
+ /* The index of the oldest frame taken into account for the statistics. */
+ uint32_t stat_first;
+
+ /* Used to compute the average frame size. */
+ uint64_t stat_size_sum;
+
+ /* Tracks the maximum frame size. */
+ uint32_t stat_size_max;
+
+
+ /* ---------- Encoder bit rate control ----------
+ *
+ * GStreamer encoders don't follow the specified bit rate very
+ * closely. These fields are used to ensure we don't exceed the desired
+ * stream bit rate, regardless of the GStreamer encoder's output.
+ */
+
/* The bit rate target for the outgoing network stream. (bits per second) */
uint64_t bit_rate;

@@ -93,6 +138,27 @@ typedef struct SpiceGstEncoder {

/* The default bit rate */
# define SPICE_GST_DEFAULT_BITRATE (8 * 1024 * 1024)
+
+ /* The bit rate control is performed using a virtual buffer to allow short
+ * term variations: bursts are allowed until the virtual buffer is full.
+ * Then frames are dropped to limit the bit rate. VBUFFER_SIZE defines the
+ * size of the virtual buffer in milliseconds worth of data.
+ */
+# define SPICE_GST_VBUFFER_SIZE 300
+
+ int32_t vbuffer_size;
+ int32_t vbuffer_free;
+
+ /* When dropping frames, this is set to the minimum mm_time of the next
+ * frame to encode. Otherwise set to zero.
+ */
+ uint32_t next_frame_mm_time;
+
+ /* Defines the minimum allowed fps. */
+# define SPICE_GST_MAX_PERIOD (NSEC_PER_SEC / 3)
+
+ /* How big a margin to take to cover latency jitter. */
+# define SPICE_GST_LATENCY_MARGIN 0.1
} SpiceGstEncoder;


@@ -131,6 +197,18 @@ static uint32_t get_source_fps(SpiceGstEncoder *encoder)
encoder->cbs.get_source_fps(encoder->cbs.opaque) : SPICE_GST_DEFAULT_FPS;
}

+static uint32_t get_network_latency(SpiceGstEncoder *encoder)
+{
+ /* Assume that the network latency is symmetric */
+ return encoder->cbs.get_roundtrip_ms ?
+ encoder->cbs.get_roundtrip_ms(encoder->cbs.opaque) / 2 : 0;
+}
+
+static inline int rate_control_is_active(SpiceGstEncoder* encoder)
+{
+ return encoder->cbs.get_roundtrip_ms != NULL;
+}
+
static inline int is_pipeline_configured(SpiceGstEncoder *encoder)
{
return encoder->src_caps != NULL;
@@ -152,6 +230,180 @@ static void free_pipeline(SpiceGstEncoder *encoder)
}
}

+
+/* ---------- Encoded frame statistics ---------- */
+
+static inline uint32_t get_last_frame_mm_time(SpiceGstEncoder *encoder)
+{
+ return encoder->history[encoder->history_last].mm_time;
+}
+
+/* Returns the current bit rate based on the last
+ * SPICE_GST_FRAME_STATISTICS_COUNT frames.
+ */
+static uint64_t get_effective_bit_rate(SpiceGstEncoder *encoder)
+{
+ uint32_t next_mm_time = encoder->next_frame_mm_time ?
+ encoder->next_frame_mm_time :
+ get_last_frame_mm_time(encoder) +
+ MSEC_PER_SEC / get_source_fps(encoder);
+ uint32_t elapsed = next_mm_time - encoder->history[encoder->stat_first].mm_time;
+ return elapsed ? encoder->stat_size_sum * 8 * MSEC_PER_SEC / elapsed : 0;
+}
+
+static uint64_t get_average_frame_size(SpiceGstEncoder *encoder)
+{
+ uint32_t count = encoder->history_last +
+ (encoder->history_last < encoder->stat_first ? SPICE_GST_HISTORY_SIZE : 0) -
+ encoder->stat_first + 1;
+ return encoder->stat_size_sum / count;
+}
+
+static uint32_t get_maximum_frame_size(SpiceGstEncoder *encoder)
+{
+ if (encoder->stat_size_max == 0) {
+ uint32_t index = encoder->history_last;
+ while (1) {
+ encoder->stat_size_max = MAX(encoder->stat_size_max,
+ encoder->history[index].size);
+ if (index == encoder->stat_first) {
+ break;
+ }
+ index = (index ? index : SPICE_GST_HISTORY_SIZE) - 1;
+ }
+ }
+ return encoder->stat_size_max;
+}
+
+/* Returns the bit rate of the specified period. from and to must be the
+ * mm time of the first and last frame to consider.
+ */
+static uint64_t get_period_bit_rate(SpiceGstEncoder *encoder, uint32_t from,
+ uint32_t to)
+{
+ uint32_t sum = 0;
+ uint32_t last_mm_time = 0;
+ uint32_t index = encoder->history_last;
+ while (1) {
+ if (encoder->history[index].mm_time == to) {
+ if (last_mm_time == 0) {
+ /* We don't know how much time elapsed between the period's
+ * last frame and the next so we cannot include it.
+ */
+ sum = 1;
+ last_mm_time = to;
+ } else {
+ sum = encoder->history[index].size + 1;
+ }
+
+ } else if (encoder->history[index].mm_time == from) {
+ sum += encoder->history[index].size;
+ return (sum - 1) * 8 * MSEC_PER_SEC / (last_mm_time - from);
+
+ } else if (index == encoder->history_first) {
+ /* This period is outside the recorded history */
+ spice_debug("period (%u-%u) outside known history (%u-%u)",
+ from, to,
+ encoder->history[encoder->history_first].mm_time,
+ encoder->history[encoder->history_last].mm_time);
+ return 0;
+
+ } else if (sum > 0) {
+ sum += encoder->history[index].size;
+
+ } else {
+ last_mm_time = encoder->history[index].mm_time;
+ }
+ index = (index ? index : SPICE_GST_HISTORY_SIZE) - 1;
+ }
+
+}
+
+static void add_frame(SpiceGstEncoder *encoder, uint32_t frame_mm_time,
+ uint32_t size)
+{
+ /* Update the statistics */
+ uint32_t count = encoder->history_last +
+ (encoder->history_last < encoder->stat_first ? SPICE_GST_HISTORY_SIZE : 0) -
+ encoder->stat_first + 1;
+ if (count == SPICE_GST_FRAME_STATISTICS_COUNT) {
+ encoder->stat_size_sum -= encoder->history[encoder->stat_first].size;
+ if (encoder->stat_size_max == encoder->history[encoder->stat_first].size) {
+ encoder->stat_size_max = 0;
+ }
+ encoder->stat_first = (encoder->stat_first + 1) % SPICE_GST_HISTORY_SIZE;
+ }
+ encoder->stat_size_sum += size;
+ if (encoder->stat_size_max > 0 && size > encoder->stat_size_max) {
+ encoder->stat_size_max = size;
+ }
+
+ /* Update the frame history */
+ encoder->history_last = (encoder->history_last + 1) % SPICE_GST_HISTORY_SIZE;
+ if (encoder->history_last == encoder->history_first) {
+ encoder->history_first = (encoder->history_first + 1) % SPICE_GST_HISTORY_SIZE;
+ }
+ encoder->history[encoder->history_last].mm_time = frame_mm_time;
+ encoder->history[encoder->history_last].size = size;
+}
+
+
+/* ---------- Encoder bit rate control ---------- */
+
+static uint32_t get_min_playback_delay(SpiceGstEncoder *encoder)
+{
+ /* Make sure the delay is large enough to send a large frame (typically
+ * an I frame) and an average frame. This also takes into account the
+ * frames dropped by the encoder bit rate control.
+ */
+ uint32_t size = get_maximum_frame_size(encoder) + get_average_frame_size(encoder);
+ uint32_t send_time = MSEC_PER_SEC * size * 8 / encoder->bit_rate;
+
+ /* Also factor in the network latency with a margin for jitter. */
+ uint32_t net_latency = get_network_latency(encoder) * (1.0 + SPICE_GST_LATENCY_MARGIN);
+
+ return send_time + net_latency;
+}
+
+static void update_client_playback_delay(SpiceGstEncoder *encoder)
+{
+ if (encoder->cbs.update_client_playback_delay) {
+ uint32_t min_delay = get_min_playback_delay(encoder);
+ encoder->cbs.update_client_playback_delay(encoder->cbs.opaque, min_delay);
+ }
+}
+
+static void update_next_frame_mm_time(SpiceGstEncoder *encoder)
+{
+ if (encoder->vbuffer_free >= 0) {
+ encoder->next_frame_mm_time = 0;
+ return;
+ }
+
+ /* Figure out how many frames to drop to not exceed the current bit rate.
+ * Use nanoseconds to avoid precision loss.
+ */
+ uint64_t delay_ns = -encoder->vbuffer_free * 8 * NSEC_PER_SEC / encoder->bit_rate;
+ uint64_t period_ns = NSEC_PER_SEC / get_source_fps(encoder);
+ uint32_t drops = (delay_ns + period_ns - 1) / period_ns; /* round up */
+ spice_debug("drops=%u vbuffer %d/%d", drops, encoder->vbuffer_free,
+ encoder->vbuffer_size);
+
+ delay_ns = drops * period_ns + period_ns / 2;
+ if (delay_ns > SPICE_GST_MAX_PERIOD) {
+ delay_ns = SPICE_GST_MAX_PERIOD;
+ }
+ encoder->next_frame_mm_time = get_last_frame_mm_time(encoder) + delay_ns / NSEC_PER_MILLISEC;
+
+ /* Drops mean a higher delay between encoded frames so update the
+ * playback delay.
+ */
+ update_client_playback_delay(encoder);
+}
+
+
+/* ---------- Network bit rate control ---------- */
+
/* The maximum bit rate we will use for the current video.
*
* This is based on a 10x compression ratio which should be more than enough
@@ -717,10 +969,22 @@ static int spice_gst_encoder_encode_frame(VideoEncoder *video_encoder,
encoder->spice_format = bitmap->format;
encoder->width = width;
encoder->height = height;
- if (encoder->pipeline) {
+ if (encoder->bit_rate == 0) {
+ encoder->history[0].mm_time = frame_mm_time;
+ encoder->bit_rate = encoder->starting_bit_rate;
+ adjust_bit_rate(encoder);
+ encoder->vbuffer_free = 0; /* Slow start */
+ } else if (encoder->pipeline) {
reconfigure_pipeline(encoder);
}
}
+
+ if (rate_control_is_active(encoder) &&
+ frame_mm_time < encoder->next_frame_mm_time) {
+ /* Drop the frame to limit the outgoing bit rate. */
+ return VIDEO_ENCODER_FRAME_DROP;
+ }
+
if (!is_pipeline_configured(encoder) &&
!configure_pipeline(encoder, bitmap)) {
return VIDEO_ENCODER_FRAME_UNSUPPORTED;
@@ -736,9 +1000,16 @@ static int spice_gst_encoder_encode_frame(VideoEncoder *video_encoder,
free_pipeline(encoder);
}
}
-
/* Unref the last frame's bitmap_opaque structures if any */
clear_zero_copy_queue(encoder, FALSE);
+
+ if (rc != VIDEO_ENCODER_FRAME_ENCODE_DONE) {
+ return rc;
+ }
+ add_frame(encoder, frame_mm_time, (*outbuf)->size);
+
+ update_next_frame_mm_time(encoder);
+
return rc;
}

@@ -750,10 +1021,13 @@ static void spice_gst_encoder_client_stream_report(VideoEncoder *video_encoder,
int32_t end_frame_delay,
uint32_t audio_delay)
{
- spice_debug("client report: #frames %u, #drops %d, duration %u video-delay %d audio-delay %u",
- num_frames, num_drops,
- end_frame_mm_time - start_frame_mm_time,
- end_frame_delay, audio_delay);
+ SpiceGstEncoder *encoder = (SpiceGstEncoder*)video_encoder;
+ uint64_t period_bit_rate = get_period_bit_rate(encoder, start_frame_mm_time, end_frame_mm_time);
+ spice_debug("client report: %u/%u drops in %ums margins video %3d audio %3u bw %.3f/%.3fMbps",
+ num_drops, num_frames, end_frame_mm_time - start_frame_mm_time,
+ end_frame_delay, audio_delay,
+ get_mbps(period_bit_rate),
+ get_mbps(get_effective_bit_rate(encoder)));
}

static void spice_gst_encoder_notify_server_frame_drop(VideoEncoder *video_encoder)
@@ -764,7 +1038,7 @@ static void spice_gst_encoder_notify_server_frame_drop(VideoEncoder *video_encod
static uint64_t spice_gst_encoder_get_bit_rate(VideoEncoder *video_encoder)
{
SpiceGstEncoder *encoder = (SpiceGstEncoder*)video_encoder;
- return encoder->bit_rate;
+ return get_effective_bit_rate(encoder);
}

static void spice_gst_encoder_get_stats(VideoEncoder *video_encoder,
@@ -775,7 +1049,7 @@ static void spice_gst_encoder_get_stats(VideoEncoder *video_encoder,

spice_return_if_fail(stats != NULL);
stats->starting_bit_rate = encoder->starting_bit_rate;
- stats->cur_bit_rate = encoder->bit_rate;
+ stats->cur_bit_rate = get_effective_bit_rate(encoder);

/* Use the compression level as a proxy for the quality */
stats->avg_quality = stats->cur_bit_rate ? 100.0 - raw_bit_rate / stats->cur_bit_rate : 0;
@@ -790,6 +1064,7 @@ VideoEncoder *gstreamer_encoder_new(SpiceVideoCodecType codec_type,
bitmap_ref_t bitmap_ref,
bitmap_unref_t bitmap_unref)
{
+ spice_return_val_if_fail(SPICE_GST_FRAME_STATISTICS_COUNT <= SPICE_GST_HISTORY_SIZE, NULL);
spice_return_val_if_fail(codec_type == SPICE_VIDEO_CODEC_TYPE_MJPEG ||
codec_type == SPICE_VIDEO_CODEC_TYPE_VP8 ||
codec_type == SPICE_VIDEO_CODEC_TYPE_H264, NULL);
--
2.8.0.rc3
Francois Gouget
2016-04-05 15:16:04 UTC
Permalink
The video encoder uses the client reports and/or the notifications of
server frame drops as its feedback mechanisms. In particular, it keeps
track of the maximum video margin and reduces the bit rate whenever the
margin goes below certain thresholds or decreases too sharply.
It uses these to figure out the lowest bit rate that causes negative
feedback and the highest bit rate that allows a return to positive
feedback. It then works to narrow this range and settles on the lower
end once the spread has gone below a given threshold.
All the while it monitors the effective bit rate to ensure the target
bit rate does not grow significantly beyond what the GStreamer encoder
actually produces: this avoids target bit rate 'bubbles' which would
invariably be followed by a bit rate crash and the accompanying frame
loss.
As soon as the network feedback indicates a significant degradation,
the bit rate is lowered to minimize the risk of frame loss and/or long
freezes.
It also relies on the existing shaping of the GStreamer output bit rate
to minimize pipeline reconfigurations.
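
A minimal sketch of the AIMD (Additive Increase, Multiplicative
Decrease) idea described above. The names are simplified stand-ins for
the fields added by the patch below, assuming max_bit_rate >= min_bit_rate:

#include <stdint.h>

#define MIN_BITRATE (128 * 1024)   /* bits per second */
#define MAX_STEP    (1024 * 1024)  /* upper bound on one additive increase */

typedef struct {
    uint64_t bit_rate;     /* current target bit rate */
    uint64_t min_bit_rate; /* highest rate known to allow recovery */
    uint64_t max_bit_rate; /* lowest rate suspected to cause congestion */
} RateState;

/* The additive step shrinks as the [min, max] range narrows. */
static uint64_t rate_step(const RateState *rs)
{
    uint64_t step = (rs->max_bit_rate - rs->min_bit_rate) / 10;
    if (step > MAX_STEP) {
        step = MAX_STEP;
    }
    return step < MIN_BITRATE ? MIN_BITRATE : step;
}

/* Positive feedback: probe upward additively, capped by the upper bound. */
static void rate_increase(RateState *rs)
{
    rs->bit_rate += rate_step(rs);
    if (rs->bit_rate > rs->max_bit_rate) {
        rs->bit_rate = rs->max_bit_rate;
    }
}

/* Negative feedback: back off multiplicatively and remember the rate that
 * proved too high so the next probe range is narrower.
 */
static void rate_decrease(RateState *rs, double factor)
{
    rs->max_bit_rate = rs->bit_rate;
    rs->bit_rate = rs->bit_rate / factor;
    if (rs->bit_rate < MIN_BITRATE) {
        rs->bit_rate = MIN_BITRATE;
    }
}

In the actual patch the decrease factor is 2 on strong negative feedback
and 4/3 otherwise, and increases are additionally gated by a time
interval since the last change.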

Signed-off-by: Francois Gouget <***@codeweavers.com>
---
server/gstreamer-encoder.c | 410 ++++++++++++++++++++++++++++++++++++++++++---
1 file changed, 384 insertions(+), 26 deletions(-)

diff --git a/server/gstreamer-encoder.c b/server/gstreamer-encoder.c
index e6ff3fa..28589c3 100644
--- a/server/gstreamer-encoder.c
+++ b/server/gstreamer-encoder.c
@@ -51,6 +51,12 @@ typedef struct {
uint32_t size;
} SpiceGstFrameInformation;

+typedef enum SpiceGstBitRateStatus {
+ SPICE_GST_BITRATE_DECREASING,
+ SPICE_GST_BITRATE_INCREASING,
+ SPICE_GST_BITRATE_STABLE,
+} SpiceGstBitRateStatus;
+
typedef struct SpiceGstEncoder {
VideoEncoder base;

@@ -91,6 +97,12 @@ typedef struct SpiceGstEncoder {
/* The frame counter for GStreamer buffers */
uint32_t frame;

+ /* The GStreamer bit rate. */
+ uint64_t video_bit_rate;
+
+ /* Don't bother changing the GStreamer bit rate if close enough. */
+# define SPICE_GST_VIDEO_BITRATE_MARGIN 0.05
+

/* ---------- Encoded frame statistics ---------- */

@@ -125,7 +137,7 @@ typedef struct SpiceGstEncoder {

/* ---------- Encoder bit rate control ----------
*
- * GStreamer encoders don't follow the specified bit rate very
+ * GStreamer encoders don't follow the specified video_bit_rate very
* closely. These fields are used to ensure we don't exceed the desired
* stream bit rate, regardless of the GStreamer encoder's output.
*/
@@ -133,7 +145,7 @@ typedef struct SpiceGstEncoder {
/* The bit rate target for the outgoing network stream. (bits per second) */
uint64_t bit_rate;

- /* The minimum bit rate */
+ /* The minimum bit rate / bit rate increment. */
# define SPICE_GST_MIN_BITRATE (128 * 1024)

/* The default bit rate */
@@ -159,6 +171,89 @@ typedef struct SpiceGstEncoder {

/* How big of a margin to take to cover for latency jitter. */
# define SPICE_GST_LATENCY_MARGIN 0.1
+
+
+ /* ---------- Network bit rate control ----------
+ *
+ * State information for figuring out the optimal bit rate for the current
+ * network conditions.
+ */
+
+ /* The mm_time of the last bit rate change. */
+ uint32_t last_change;
+
+ /* How much to reduce the bit rate in case of network congestion. */
+# define SPICE_GST_BITRATE_CUT 2
+# define SPICE_GST_BITRATE_REDUCE (4.0 / 3.0)
+
+ /* Never increase the bit rate by more than this amount (bits per second). */
+# define SPICE_GST_BITRATE_MAX_STEP (1024 * 1024)
+
+ /* The maximum bit rate that one can maybe use without causing network
+ * congestion.
+ */
+ uint64_t max_bit_rate;
+
+ /* The last bit rate that let us recover from network congestion. */
+ uint64_t min_bit_rate;
+
+ /* Defines when the spread between max_bit_rate and min_bit_rate has been
+ * narrowed down enough. Note that this value should be large enough for
+ * min_bit_rate to allow recovery from network congestion in a reasonable
+ * time frame, and to absorb transient traffic spikes (potentially from
+ * other sources).
+ * This is also used as a multiplier for the video_bit_rate so it does not
+ * have to be changed too often.
+ */
+# define SPICE_GST_BITRATE_MARGIN SPICE_GST_BITRATE_REDUCE
+
+ /* Whether the bit rate was last decreased, increased or kept stable. */
+ SpiceGstBitRateStatus status;
+
+ /* The network bit rate control uses an AIMD scheme (Additive Increase,
+ * Multiplicative Decrease). The increment step depends on the spread
+ * between the minimum and maximum bit rates.
+ */
+ uint64_t bit_rate_step;
+
+ /* How often to increase the bit rate. */
+ uint32_t increase_interval;
+
+# define SPICE_GST_BITRATE_UP_INTERVAL (MSEC_PER_SEC * 2)
+# define SPICE_GST_BITRATE_UP_CLIENT_STABLE (MSEC_PER_SEC * 60 * 2)
+# define SPICE_GST_BITRATE_UP_SERVER_STABLE (MSEC_PER_SEC * 3600 * 4)
+# define SPICE_GST_BITRATE_UP_RESET_MAX (MSEC_PER_SEC * 30)
+
+
+ /* ---------- Client feedback ---------- */
+
+ /* TRUE if gst_encoder_client_stream_report() is being called. */
+ gboolean has_client_reports;
+
+ /* The margin is the amount of time between the reception of a piece of
+ * media data by the client and the time when it should be played/displayed.
+ * Increasing the bit rate increases the transmission time and thus reduces
+ * the margin.
+ */
+ int32_t last_video_margin;
+ int32_t max_video_margin;
+ uint32_t max_audio_margin;
+
+# define SPICE_GST_VIDEO_MARGIN_GOOD 0.75
+# define SPICE_GST_VIDEO_MARGIN_AVERAGE 0.5
+# define SPICE_GST_VIDEO_MARGIN_BAD 0.3
+
+# define SPICE_GST_VIDEO_DELTA_BAD 0.2
+# define SPICE_GST_VIDEO_DELTA_AVERAGE 0.15
+
+# define SPICE_GST_AUDIO_MARGIN_BAD 0.5
+# define SPICE_GST_AUDIO_VIDEO_RATIO 1.25
+
+
+ /* ---------- Server feedback ---------- */
+
+ /* How many frames were dropped by the server since the last encoded frame. */
+ uint32_t server_drops;
} SpiceGstEncoder;


@@ -350,6 +445,16 @@ static void add_frame(SpiceGstEncoder *encoder, uint32_t frame_mm_time,

/* ---------- Encoder bit rate control ---------- */

+static void set_video_bit_rate(SpiceGstEncoder *encoder, uint64_t bit_rate)
+{
+ if (abs(bit_rate - encoder->video_bit_rate) > encoder->video_bit_rate * SPICE_GST_VIDEO_BITRATE_MARGIN) {
+ encoder->video_bit_rate = bit_rate;
+ if (is_pipeline_configured(encoder)) {
+ free_pipeline(encoder);
+ }
+ }
+}
+
static uint32_t get_min_playback_delay(SpiceGstEncoder *encoder)
{
/* Make sure the delay is large enough to send a large frame (typically
@@ -391,6 +496,12 @@ static void update_next_frame_mm_time(SpiceGstEncoder *encoder)

delay_ns = drops * period_ns + period_ns / 2;
if (delay_ns > SPICE_GST_MAX_PERIOD) {
+ /* Reduce the video bit rate so we don't have to drop so many frames. */
+ if (encoder->video_bit_rate > encoder->bit_rate * SPICE_GST_BITRATE_MARGIN) {
+ set_video_bit_rate(encoder, encoder->bit_rate * SPICE_GST_BITRATE_MARGIN);
+ } else {
+ set_video_bit_rate(encoder, encoder->bit_rate);
+ }
delay_ns = SPICE_GST_MAX_PERIOD;
}
encoder->next_frame_mm_time = get_last_frame_mm_time(encoder) + delay_ns / NSEC_PER_MILLISEC;
@@ -415,19 +526,165 @@ static uint64_t get_bit_rate_cap(SpiceGstEncoder *encoder)
return raw_frame_bits * get_source_fps(encoder) / 10;
}

-static void adjust_bit_rate(SpiceGstEncoder *encoder)
+static void set_bit_rate(SpiceGstEncoder *encoder, uint64_t bit_rate)
{
- if (encoder->bit_rate == 0) {
- /* Use the default value, */
- encoder->bit_rate = SPICE_GST_DEFAULT_BITRATE;
- } else if (encoder->bit_rate < SPICE_GST_MIN_BITRATE) {
- /* don't let the bit rate go too low */
+ if (bit_rate == 0) {
+ /* Use the default value */
+ bit_rate = SPICE_GST_DEFAULT_BITRATE;
+ }
+ if (bit_rate == encoder->bit_rate) {
+ return;
+ }
+ if (bit_rate < SPICE_GST_MIN_BITRATE) {
+ /* Don't let the bit rate go too low... */
encoder->bit_rate = SPICE_GST_MIN_BITRATE;
- } else {
+ } else if (bit_rate > encoder->bit_rate) {
/* or too high */
- encoder->bit_rate = MIN(encoder->bit_rate, get_bit_rate_cap(encoder));
+ bit_rate = MIN(bit_rate, get_bit_rate_cap(encoder));
+ }
+
+ if (bit_rate < encoder->min_bit_rate) {
+ encoder->min_bit_rate = bit_rate;
+ encoder->bit_rate_step = 0;
+ } else if (encoder->status == SPICE_GST_BITRATE_DECREASING &&
+ bit_rate > encoder->bit_rate) {
+ encoder->min_bit_rate = encoder->bit_rate;
+ encoder->bit_rate_step = 0;
+ } else if (encoder->status != SPICE_GST_BITRATE_DECREASING &&
+ bit_rate < encoder->bit_rate) {
+ encoder->max_bit_rate = encoder->bit_rate - SPICE_GST_MIN_BITRATE;
+ encoder->bit_rate_step = 0;
+ }
+ encoder->increase_interval = SPICE_GST_BITRATE_UP_INTERVAL;
+
+ if (encoder->bit_rate_step == 0) {
+ encoder->bit_rate_step = MAX(SPICE_GST_MIN_BITRATE,
+ MIN(SPICE_GST_BITRATE_MAX_STEP,
+ (encoder->max_bit_rate - encoder->min_bit_rate) / 10));
+ encoder->status = (bit_rate < encoder->bit_rate) ? SPICE_GST_BITRATE_DECREASING : SPICE_GST_BITRATE_INCREASING;
+ if (encoder->max_bit_rate / SPICE_GST_BITRATE_MARGIN < encoder->min_bit_rate) {
+ /* We have sufficiently narrowed down the optimal bit rate range.
+ * Settle on the lower end to keep a safety margin and stop rocking
+ * the boat.
+ */
+ bit_rate = encoder->min_bit_rate;
+ encoder->status = SPICE_GST_BITRATE_STABLE;
+ encoder->increase_interval = encoder->has_client_reports ? SPICE_GST_BITRATE_UP_CLIENT_STABLE : SPICE_GST_BITRATE_UP_SERVER_STABLE;
+ set_video_bit_rate(encoder, bit_rate);
+ }
+ }
+ spice_debug("%u set_bit_rate(%.3fMbps) eff %.3f %.3f-%.3f %d",
+ get_last_frame_mm_time(encoder) - encoder->last_change,
+ get_mbps(bit_rate), get_mbps(get_effective_bit_rate(encoder)),
+ get_mbps(encoder->min_bit_rate),
+ get_mbps(encoder->max_bit_rate), encoder->status);
+
+ encoder->last_change = get_last_frame_mm_time(encoder);
+ encoder->bit_rate = bit_rate;
+ /* Adjust the vbuffer size without ever increasing vbuffer_free to avoid
+ * sudden bit rate increases.
+ */
+ int32_t new_size = bit_rate * SPICE_GST_VBUFFER_SIZE / MSEC_PER_SEC / 8;
+ if (new_size < encoder->vbuffer_size && encoder->vbuffer_free > 0) {
+ encoder->vbuffer_free = MAX(0, encoder->vbuffer_free + new_size - encoder->vbuffer_size);
+ }
+ encoder->vbuffer_size = new_size;
+ update_next_frame_mm_time(encoder);
+
+ /* Frames preceding the bit rate change are not relevant to the current
+ * situation anymore.
+ */
+ encoder->stat_first = encoder->history_last;
+ encoder->stat_size_sum = encoder->stat_size_max = encoder->history[encoder->history_last].size;
+
+ if (bit_rate > encoder->video_bit_rate) {
+ set_video_bit_rate(encoder, bit_rate * SPICE_GST_BITRATE_MARGIN);
+ }
+}
+
+static void increase_bit_rate(SpiceGstEncoder *encoder)
+{
+ if (get_effective_bit_rate(encoder) < encoder->bit_rate) {
+ /* The GStreamer encoder currently uses less bandwidth than allowed.
+ * So increasing the limit again makes no sense.
+ */
+ return;
+ }
+
+ if (encoder->bit_rate == encoder->max_bit_rate &&
+ get_last_frame_mm_time(encoder) - encoder->last_change > SPICE_GST_BITRATE_UP_RESET_MAX) {
+ /* The maximum bit rate seems to be sustainable so it was probably set
+ * too low. Probe for the maximum bit rate again.
+ */
+ encoder->max_bit_rate = get_bit_rate_cap(encoder);
+ encoder->status = SPICE_GST_BITRATE_INCREASING;
+ }
+
+ uint64_t new_bit_rate = MIN(encoder->bit_rate + encoder->bit_rate_step,
+ encoder->max_bit_rate);
+ spice_debug("increase bit rate to %.3fMbps %.3f-%.3fMbps %d",
+ get_mbps(new_bit_rate), get_mbps(encoder->min_bit_rate),
+ get_mbps(encoder->max_bit_rate), encoder->status);
+ set_bit_rate(encoder, new_bit_rate);
+}
+
+
+/* ---------- Server feedback ---------- */
+
+/* A helper for gst_encoder_encode_frame()
+ *
+ * Checks how many frames got dropped since the last encoded frame and adjusts
+ * the bit rate accordingly.
+ */
+static inline gboolean handle_server_drops(SpiceGstEncoder *encoder,
+ uint32_t frame_mm_time)
+{
+ if (encoder->server_drops == 0) {
+ return FALSE;
+ }
+
+ spice_debug("server report: got %u drops in %ums after %ums",
+ encoder->server_drops,
+ frame_mm_time - get_last_frame_mm_time(encoder),
+ frame_mm_time - encoder->last_change);
+
+ /* The server dropped a frame so clearly the buffer is full. */
+ encoder->vbuffer_free = MIN(encoder->vbuffer_free, 0);
+ /* Add a 0 byte frame so the time spent dropping frames is not counted as
+ * time during which the buffer was refilling. This implies dropping this
+ * frame.
+ */
+ add_frame(encoder, frame_mm_time, 0);
+
+ if (encoder->server_drops >= get_source_fps(encoder)) {
+ spice_debug("cut the bit rate");
+ uint64_t bit_rate = (encoder->bit_rate == encoder->min_bit_rate) ?
+ encoder->bit_rate / SPICE_GST_BITRATE_CUT :
+ MAX(encoder->min_bit_rate, encoder->bit_rate / SPICE_GST_BITRATE_CUT);
+ set_bit_rate(encoder, bit_rate);
+
+ } else {
+ spice_debug("reduce the bit rate");
+ uint64_t bit_rate = (encoder->bit_rate == encoder->min_bit_rate) ?
+ encoder->bit_rate / SPICE_GST_BITRATE_REDUCE :
+ MAX(encoder->min_bit_rate, encoder->bit_rate / SPICE_GST_BITRATE_REDUCE);
+ set_bit_rate(encoder, bit_rate);
+ }
+ encoder->server_drops = 0;
+ return TRUE;
+}
+
+/* A helper for gst_encoder_encode_frame() */
+static inline void server_increase_bit_rate(SpiceGstEncoder *encoder,
+ uint32_t frame_mm_time)
+{
+ /* Let gst_encoder_client_stream_report() deal with bit rate increases if
+ * we receive client reports.
+ */
+ if (!encoder->has_client_reports && encoder->server_drops == 0 &&
+ frame_mm_time - encoder->last_change >= encoder->increase_interval) {
+ increase_bit_rate(encoder);
}
- spice_debug("adjust_bit_rate(%.3fMbps)", get_mbps(encoder->bit_rate));
}


@@ -572,12 +829,10 @@ static gboolean configure_pipeline(SpiceGstEncoder *encoder,
}

/* Configure the encoder bitrate */
- adjust_bit_rate(encoder);
- switch (encoder->base.codec_type)
- {
+ switch (encoder->base.codec_type) {
case SPICE_VIDEO_CODEC_TYPE_MJPEG:
g_object_set(G_OBJECT(encoder->gstenc),
- "bitrate", (gint)encoder->bit_rate,
+ "bitrate", (gint)encoder->video_bit_rate,
NULL);
/* See https://bugzilla.gnome.org/show_bug.cgi?id=753257 */
spice_debug("removing the pipeline clock");
@@ -585,12 +840,12 @@ static gboolean configure_pipeline(SpiceGstEncoder *encoder,
break;
case SPICE_VIDEO_CODEC_TYPE_VP8:
g_object_set(G_OBJECT(encoder->gstenc),
- "target-bitrate", (gint)encoder->bit_rate,
+ "target-bitrate", (gint)encoder->video_bit_rate,
NULL);
break;
case SPICE_VIDEO_CODEC_TYPE_H264:
g_object_set(G_OBJECT(encoder->gstenc),
- "bitrate", encoder->bit_rate / 1024,
+ "bitrate", (guint)(encoder->bit_rate / 1024),
NULL);
break;
default:
@@ -971,8 +1226,10 @@ static int spice_gst_encoder_encode_frame(VideoEncoder *video_encoder,
encoder->height = height;
if (encoder->bit_rate == 0) {
encoder->history[0].mm_time = frame_mm_time;
- encoder->bit_rate = encoder->starting_bit_rate;
- adjust_bit_rate(encoder);
+ encoder->max_bit_rate = get_bit_rate_cap(encoder);
+ encoder->min_bit_rate = SPICE_GST_MIN_BITRATE;
+ encoder->status = SPICE_GST_BITRATE_DECREASING;
+ set_bit_rate(encoder, encoder->starting_bit_rate);
encoder->vbuffer_free = 0; /* Slow start */
} else if (encoder->pipeline) {
reconfigure_pipeline(encoder);
@@ -980,7 +1237,8 @@ static int spice_gst_encoder_encode_frame(VideoEncoder *video_encoder,
}

if (rate_control_is_active(encoder) &&
- frame_mm_time < encoder->next_frame_mm_time) {
+ (handle_server_drops(encoder, frame_mm_time) ||
+ frame_mm_time < encoder->next_frame_mm_time)) {
/* Drop the frame to limit the outgoing bit rate. */
return VIDEO_ENCODER_FRAME_DROP;
}
@@ -1006,8 +1264,14 @@ static int spice_gst_encoder_encode_frame(VideoEncoder *video_encoder,
if (rc != VIDEO_ENCODER_FRAME_ENCODE_DONE) {
return rc;
}
+ uint32_t last_mm_time = get_last_frame_mm_time(encoder);
add_frame(encoder, frame_mm_time, (*outbuf)->size);

+ int32_t refill = encoder->bit_rate * (frame_mm_time - last_mm_time) / MSEC_PER_SEC / 8;
+ encoder->vbuffer_free = MIN(encoder->vbuffer_free + refill,
+ encoder->vbuffer_size) - (*outbuf)->size;
+
+ server_increase_bit_rate(encoder, frame_mm_time);
update_next_frame_mm_time(encoder);

return rc;
@@ -1018,21 +1282,115 @@ static void spice_gst_encoder_client_stream_report(VideoEncoder *video_encoder,
uint32_t num_drops,
uint32_t start_frame_mm_time,
uint32_t end_frame_mm_time,
- int32_t end_frame_delay,
- uint32_t audio_delay)
+ int32_t video_margin,
+ uint32_t audio_margin)
{
SpiceGstEncoder *encoder = (SpiceGstEncoder*)video_encoder;
+ encoder->has_client_reports = TRUE;
+
+ encoder->max_video_margin = MAX(encoder->max_video_margin, video_margin);
+ encoder->max_audio_margin = MAX(encoder->max_audio_margin, audio_margin);
+ int32_t margin_delta = video_margin - encoder->last_video_margin;
+ encoder->last_video_margin = video_margin;
+
uint64_t period_bit_rate = get_period_bit_rate(encoder, start_frame_mm_time, end_frame_mm_time);
- spice_debug("client report: %u/%u drops in %ums margins video %3d audio %3u bw %.3f/%.3fMbps",
+ spice_debug("client report: %u/%u drops in %ums margins video %3d/%3d audio %3u/%3u bw %.3f/%.3fMbps%s",
num_drops, num_frames, end_frame_mm_time - start_frame_mm_time,
- end_frame_delay, audio_delay,
+ video_margin, encoder->max_video_margin,
+ audio_margin, encoder->max_audio_margin,
get_mbps(period_bit_rate),
- get_mbps(get_effective_bit_rate(encoder)));
+ get_mbps(get_effective_bit_rate(encoder)),
+ start_frame_mm_time < encoder->last_change ? " obsolete" : "");
+ if (encoder->status == SPICE_GST_BITRATE_DECREASING &&
+ start_frame_mm_time < encoder->last_change) {
+ /* Some of this data predates the last bit rate reduction
+ * so it is obsolete.
+ */
+ return;
+ }
+
+ /* We normally arrange for even the largest frames to arrive a bit over
+ * one period before they should be displayed.
+ */
+ uint32_t min_margin = MSEC_PER_SEC / get_source_fps(encoder) +
+ get_network_latency(encoder) * SPICE_GST_LATENCY_MARGIN;
+
+ /* A low video margin indicates that the bit rate is too high. */
+ uint32_t score;
+ if (num_drops) {
+ score = 4;
+ } else if (margin_delta >= 0) {
+ /* The situation was bad but seems to be improving */
+ score = 0;
+ } else if (video_margin < min_margin * SPICE_GST_VIDEO_MARGIN_BAD ||
+ video_margin < encoder->max_video_margin * SPICE_GST_VIDEO_MARGIN_BAD) {
+ score = 3;
+ } else if (video_margin < min_margin ||
+ video_margin < encoder->max_video_margin * SPICE_GST_VIDEO_MARGIN_AVERAGE) {
+ score = 2;
+ } else if (video_margin < encoder->max_video_margin * SPICE_GST_VIDEO_MARGIN_GOOD) {
+ score = 1;
+ } else {
+ score = 0;
+ }
+ /* A fast dropping video margin is a compounding factor. */
+ if (margin_delta < -abs(encoder->max_video_margin) * SPICE_GST_VIDEO_DELTA_BAD) {
+ score += 2;
+ } else if (margin_delta < -abs(encoder->max_video_margin) * SPICE_GST_VIDEO_DELTA_AVERAGE) {
+ score += 1;
+ }
+
+ if (score > 3) {
+ spice_debug("score %u, cut the bit rate", score);
+ uint64_t bit_rate = (encoder->bit_rate == encoder->min_bit_rate) ?
+ encoder->bit_rate / SPICE_GST_BITRATE_CUT :
+ MAX(encoder->min_bit_rate, encoder->bit_rate / SPICE_GST_BITRATE_CUT);
+ set_bit_rate(encoder, bit_rate);
+
+ } else if (score == 3) {
+ spice_debug("score %u, reduce the bit rate", score);
+ uint64_t bit_rate = (encoder->bit_rate == encoder->min_bit_rate) ?
+ encoder->bit_rate / SPICE_GST_BITRATE_REDUCE :
+ MAX(encoder->min_bit_rate, encoder->bit_rate / SPICE_GST_BITRATE_REDUCE);
+ set_bit_rate(encoder, bit_rate);
+
+ } else if (score == 2) {
+ spice_debug("score %u, decrement the bit rate", score);
+ set_bit_rate(encoder, encoder->bit_rate - encoder->bit_rate_step);
+
+ } else if (audio_margin < encoder->max_audio_margin * SPICE_GST_AUDIO_MARGIN_BAD &&
+ audio_margin * SPICE_GST_AUDIO_VIDEO_RATIO < video_margin) {
+ /* The audio margin has decreased a lot while the video_margin
+ * remained higher. It may be that the video stream is starving the
+ * audio one of bandwidth. So reduce the bit rate.
+ */
+ spice_debug("free some bandwidth for the audio stream");
+ set_bit_rate(encoder, encoder->bit_rate - encoder->bit_rate_step);
+
+ } else if (score == 1 && period_bit_rate <= encoder->bit_rate &&
+ encoder->status == SPICE_GST_BITRATE_INCREASING) {
+ /* We only increase the bit rate when score == 0 so things got worse
+ * since the last increase, and not because of a transient bit rate
+ * peak.
+ */
+ spice_debug("degraded margin, decrement bit rate %.3f <= %.3fMbps",
+ get_mbps(period_bit_rate), get_mbps(encoder->bit_rate));
+ set_bit_rate(encoder, encoder->bit_rate - encoder->bit_rate_step);
+
+ } else if (score == 0 &&
+ get_last_frame_mm_time(encoder) - encoder->last_change >= encoder->increase_interval) {
+ /* The video margin is consistently high so increase the bit rate. */
+ increase_bit_rate(encoder);
+ }
}

static void spice_gst_encoder_notify_server_frame_drop(VideoEncoder *video_encoder)
{
- spice_debug("server report: getting frame drops...");
+ SpiceGstEncoder *encoder = (SpiceGstEncoder*)video_encoder;
+ if (encoder->server_drops == 0) {
+ spice_debug("server report: getting frame drops...");
+ }
+ encoder->server_drops++;
}

static uint64_t spice_gst_encoder_get_bit_rate(VideoEncoder *video_encoder)
--
2.8.0.rc3
Francois Gouget
2016-04-05 15:16:12 UTC
Permalink
Signed-off-by: Francois Gouget <***@codeweavers.com>
---
server/gstreamer-encoder.c | 40 +++++++++++++++++++++++++++++++++-------
1 file changed, 33 insertions(+), 7 deletions(-)

diff --git a/server/gstreamer-encoder.c b/server/gstreamer-encoder.c
index 28589c3..d71c26c 100644
--- a/server/gstreamer-encoder.c
+++ b/server/gstreamer-encoder.c
@@ -48,6 +48,7 @@ typedef struct SpiceGstVideoBuffer {

typedef struct {
uint32_t mm_time;
+ uint64_t duration;
uint32_t size;
} SpiceGstFrameInformation;

@@ -128,6 +129,9 @@ typedef struct SpiceGstEncoder {
/* The index of the oldest frame taken into account for the statistics. */
uint32_t stat_first;

+ /* Used to compute the average frame encoding time. */
+ uint64_t stat_duration_sum;
+
/* Used to compute the average frame size. */
uint64_t stat_size_sum;

@@ -346,6 +350,14 @@ static uint64_t get_effective_bit_rate(SpiceGstEncoder *encoder)
return elapsed ? encoder->stat_size_sum * 8 * MSEC_PER_SEC / elapsed : 0;
}

+static uint64_t get_average_encoding_time(SpiceGstEncoder *encoder)
+{
+ uint32_t count = encoder->history_last +
+ (encoder->history_last < encoder->stat_first ? SPICE_GST_HISTORY_SIZE : 0) -
+ encoder->stat_first + 1;
+ return encoder->stat_duration_sum / count;
+}
+
static uint64_t get_average_frame_size(SpiceGstEncoder *encoder)
{
uint32_t count = encoder->history_last +
@@ -415,19 +427,21 @@ static uint64_t get_period_bit_rate(SpiceGstEncoder *encoder, uint32_t from,
}

static void add_frame(SpiceGstEncoder *encoder, uint32_t frame_mm_time,
- uint32_t size)
+ uint64_t duration, uint32_t size)
{
/* Update the statistics */
uint32_t count = encoder->history_last +
(encoder->history_last < encoder->stat_first ? SPICE_GST_HISTORY_SIZE : 0) -
encoder->stat_first + 1;
if (count == SPICE_GST_FRAME_STATISTICS_COUNT) {
+ encoder->stat_duration_sum -= encoder->history[encoder->stat_first].duration;
encoder->stat_size_sum -= encoder->history[encoder->stat_first].size;
if (encoder->stat_size_max == encoder->history[encoder->stat_first].size) {
encoder->stat_size_max = 0;
}
encoder->stat_first = (encoder->stat_first + 1) % SPICE_GST_HISTORY_SIZE;
}
+ encoder->stat_duration_sum += duration;
encoder->stat_size_sum += size;
if (encoder->stat_size_max > 0 && size > encoder->stat_size_max) {
encoder->stat_size_max = size;
@@ -439,6 +453,7 @@ static void add_frame(SpiceGstEncoder *encoder, uint32_t frame_mm_time,
encoder->history_first = (encoder->history_first + 1) % SPICE_GST_HISTORY_SIZE;
}
encoder->history[encoder->history_last].mm_time = frame_mm_time;
+ encoder->history[encoder->history_last].duration = duration;
encoder->history[encoder->history_last].size = size;
}

@@ -473,15 +488,23 @@ static uint32_t get_min_playback_delay(SpiceGstEncoder *encoder)
static void update_client_playback_delay(SpiceGstEncoder *encoder)
{
if (encoder->cbs.update_client_playback_delay) {
- uint32_t min_delay = get_min_playback_delay(encoder);
+ uint32_t min_delay = get_min_playback_delay(encoder) + get_average_encoding_time(encoder) / NSEC_PER_MILLISEC;
encoder->cbs.update_client_playback_delay(encoder->cbs.opaque, min_delay);
}
}

static void update_next_frame_mm_time(SpiceGstEncoder *encoder)
{
+ uint64_t period_ns = NSEC_PER_SEC / get_source_fps(encoder);
+ uint64_t min_delay_ns = get_average_encoding_time(encoder);
+ if (min_delay_ns > period_ns) {
+ spice_warning("your system seems to be too slow to encode this %dx%d video in real time", encoder->width, encoder->height);
+ }
+
+ min_delay_ns = MIN(min_delay_ns, SPICE_GST_MAX_PERIOD);
if (encoder->vbuffer_free >= 0) {
- encoder->next_frame_mm_time = 0;
+ encoder->next_frame_mm_time = get_last_frame_mm_time(encoder) +
+ min_delay_ns / NSEC_PER_MILLISEC;
return;
}

@@ -489,7 +512,6 @@ static void update_next_frame_mm_time(SpiceGstEncoder *encoder)
* Use nanoseconds to avoid precision loss.
*/
uint64_t delay_ns = -encoder->vbuffer_free * 8 * NSEC_PER_SEC / encoder->bit_rate;
- uint64_t period_ns = NSEC_PER_SEC / get_source_fps(encoder);
uint32_t drops = (delay_ns + period_ns - 1) / period_ns; /* round up */
spice_debug("drops=%u vbuffer %d/%d", drops, encoder->vbuffer_free,
encoder->vbuffer_size);
@@ -504,7 +526,8 @@ static void update_next_frame_mm_time(SpiceGstEncoder *encoder)
}
delay_ns = SPICE_GST_MAX_PERIOD;
}
- encoder->next_frame_mm_time = get_last_frame_mm_time(encoder) + delay_ns / NSEC_PER_MILLISEC;
+ encoder->next_frame_mm_time = get_last_frame_mm_time(encoder) +
+ MAX(delay_ns, min_delay_ns) / NSEC_PER_MILLISEC;

/* Drops mean a higher delay between encoded frames so update the
* playback delay.
@@ -595,6 +618,7 @@ static void set_bit_rate(SpiceGstEncoder *encoder, uint64_t bit_rate)
* situation anymore.
*/
encoder->stat_first = encoder->history_last;
+ encoder->stat_duration_sum = encoder->history[encoder->history_last].duration;
encoder->stat_size_sum = encoder->stat_size_max = encoder->history[encoder->history_last].size;

if (bit_rate > encoder->video_bit_rate) {
@@ -654,7 +678,7 @@ static inline gboolean handle_server_drops(SpiceGstEncoder *encoder,
* time during which the buffer was refilling. This implies dropping this
* frame.
*/
- add_frame(encoder, frame_mm_time, 0);
+ add_frame(encoder, frame_mm_time, 0, 0);

if (encoder->server_drops >= get_source_fps(encoder)) {
spice_debug("cut the bit rate");
@@ -1248,6 +1272,7 @@ static int spice_gst_encoder_encode_frame(VideoEncoder *video_encoder,
return VIDEO_ENCODER_FRAME_UNSUPPORTED;
}

+ uint64_t start = spice_get_monotonic_time_ns();
int rc = push_raw_frame(encoder, bitmap, src, top_down, bitmap_opaque);
if (rc == VIDEO_ENCODER_FRAME_ENCODE_DONE) {
rc = pull_compressed_buffer(encoder, outbuf);
@@ -1265,7 +1290,8 @@ static int spice_gst_encoder_encode_frame(VideoEncoder *video_encoder,
return rc;
}
uint32_t last_mm_time = get_last_frame_mm_time(encoder);
- add_frame(encoder, frame_mm_time, (*outbuf)->size);
+ add_frame(encoder, frame_mm_time, spice_get_monotonic_time_ns() - start,
+ (*outbuf)->size);

int32_t refill = encoder->bit_rate * (frame_mm_time - last_mm_time) / MSEC_PER_SEC / 8;
encoder->vbuffer_free = MIN(encoder->vbuffer_free + refill,
--
2.8.0.rc3
Francois Gouget
2016-04-05 15:16:15 UTC
Permalink
Such repeated encoding errors typically happen when sending very small
frames (less than 16 pixels in one dimension) to the x264enc encoder.
Giving up after a few consecutive errors avoids repeatedly wasting time
rebuilding the pipeline.
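
The approach boils down to a small consecutive-error gate; a sketch with
a hypothetical counter and threshold (the patch below keeps the counter
in the encoder state and resets it on format or size changes):

#include <glib.h>

/* Give up after this many consecutive encoding failures. */
#define MAX_CONSECUTIVE_ERRORS 3

/* Once the threshold is reached, frames are rejected without touching
 * the pipeline until the frame format or size changes again.
 */
static gboolean encoder_gave_up(guint consecutive_errors)
{
    return consecutive_errors >= MAX_CONSECUTIVE_ERRORS;
}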

Signed-off-by: Francois Gouget <***@codeweavers.com>
---
server/gstreamer-encoder.c | 53 ++++++++++++++++++++++++++++++++++++++++------
1 file changed, 46 insertions(+), 7 deletions(-)

diff --git a/server/gstreamer-encoder.c b/server/gstreamer-encoder.c
index d71c26c..4a7e5be 100644
--- a/server/gstreamer-encoder.c
+++ b/server/gstreamer-encoder.c
@@ -82,6 +82,9 @@ typedef struct SpiceGstEncoder {
const SpiceFormatForGStreamer *format;
SpiceBitmapFmt spice_format;

+ /* Number of consecutive frame encoding errors. */
+ uint32_t errors;
+
/* ---------- GStreamer pipeline ---------- */

/* Pointers to the GStreamer pipeline elements. If pipeline is NULL the
@@ -775,15 +778,35 @@ static int get_physical_core_count(void)
return physical_core_count;
}

-/* A helper for spice_gst_encoder_encode_frame() */
+static const gchar* get_gst_codec_name(SpiceGstEncoder *encoder)
+{
+ switch (encoder->base.codec_type)
+ {
+ case SPICE_VIDEO_CODEC_TYPE_MJPEG:
+ return "avenc_mjpeg";
+ case SPICE_VIDEO_CODEC_TYPE_VP8:
+ return "vp8enc";
+ case SPICE_VIDEO_CODEC_TYPE_H264:
+ return "x264enc";
+ default:
+ /* gstreamer_encoder_new() should have rejected this codec type */
+ spice_warning("unsupported codec type %d", encoder->base.codec_type);
+ return NULL;
+ }
+}
+
static gboolean create_pipeline(SpiceGstEncoder *encoder)
{
- gchar *gstenc;
+ const gchar* gstenc_name = get_gst_codec_name(encoder);
+ if (!gstenc_name) {
+ return FALSE;
+ }
+ gchar* gstenc_opts;
switch (encoder->base.codec_type)
{
case SPICE_VIDEO_CODEC_TYPE_MJPEG:
/* Set max-threads to ensure zero-frame latency */
- gstenc = g_strdup("avenc_mjpeg max-threads=1");
+ gstenc_opts = g_strdup("max-threads=1");
break;
case SPICE_VIDEO_CODEC_TYPE_VP8: {
/* See http://www.webmproject.org/docs/encoder-parameters/
@@ -805,7 +828,7 @@ static gboolean create_pipeline(SpiceGstEncoder *encoder)
*/
int threads = get_physical_core_count();
int parts = threads < 2 ? 0 : threads < 4 ? 1 : threads < 8 ? 2 : 3;
- gstenc = g_strdup_printf("vp8enc end-usage=cbr min-quantizer=10 resize-allowed=true error-resilient=true lag-in-frames=0 deadline=1 cpu-used=4 threads=%d token-partitions=%d", threads, parts);
+ gstenc_opts = g_strdup_printf("end-usage=cbr min-quantizer=10 resize-allowed=true error-resilient=true lag-in-frames=0 deadline=1 cpu-used=4 threads=%d token-partitions=%d", threads, parts);
break;
}
case SPICE_VIDEO_CODEC_TYPE_H264:
@@ -815,7 +838,7 @@ static gboolean create_pipeline(SpiceGstEncoder *encoder)
* - Set intra-refresh to get more uniform compressed frame sizes,
* thus helping with streaming.
*/
- gstenc = g_strdup("x264enc byte-stream=true aud=true qp-min=15 tune=4 sliced-threads=true speed-preset=ultrafast intra-refresh=true");
+ gstenc_opts = g_strdup("byte-stream=true aud=true qp-min=15 tune=4 sliced-threads=true speed-preset=ultrafast intra-refresh=true");
break;
default:
/* gstreamer_encoder_new() should have rejected this codec type */
@@ -824,10 +847,10 @@ static gboolean create_pipeline(SpiceGstEncoder *encoder)
}

GError *err = NULL;
- gchar *desc = g_strdup_printf("appsrc is-live=true format=time do-timestamp=true name=src ! videoconvert ! %s name=encoder ! appsink name=sink", gstenc);
+ gchar *desc = g_strdup_printf("appsrc is-live=true format=time do-timestamp=true name=src ! videoconvert ! %s %s name=encoder ! appsink name=sink", gstenc_name, gstenc_opts);
spice_debug("GStreamer pipeline: %s", desc);
encoder->pipeline = gst_parse_launch_full(desc, NULL, GST_PARSE_FLAG_FATAL_ERRORS, &err);
- g_free(gstenc);
+ g_free(gstenc_opts);
g_free(desc);
if (!encoder->pipeline || err) {
spice_warning("GStreamer error: %s", err->message);
@@ -1243,6 +1266,7 @@ static int spice_gst_encoder_encode_frame(VideoEncoder *video_encoder,
encoder->format = map_format(bitmap->format);
if (!encoder->format) {
spice_warning("unable to map format type %d", bitmap->format);
+ encoder->errors = 4;
return VIDEO_ENCODER_FRAME_UNSUPPORTED;
}
encoder->spice_format = bitmap->format;
@@ -1258,6 +1282,19 @@ static int spice_gst_encoder_encode_frame(VideoEncoder *video_encoder,
} else if (encoder->pipeline) {
reconfigure_pipeline(encoder);
}
+ encoder->errors = 0;
+ } else if (encoder->errors >= 3) {
+ /* The pipeline keeps failing to handle the frames we send it, which
+ * is usually because they are too small (mouse pointer-sized).
+ * So give up until something changes.
+ */
+ if (encoder->errors == 3) {
+ spice_debug("%s cannot compress %dx%d:%dbpp frames",
+ get_gst_codec_name(encoder), encoder->width,
+ encoder->height, encoder->format->bpp);
+ encoder->errors++;
+ }
+ return VIDEO_ENCODER_FRAME_UNSUPPORTED;
}

if (rate_control_is_active(encoder) &&
@@ -1269,6 +1306,7 @@ static int spice_gst_encoder_encode_frame(VideoEncoder *video_encoder,

if (!is_pipeline_configured(encoder) &&
!configure_pipeline(encoder, bitmap)) {
+ encoder->errors++;
return VIDEO_ENCODER_FRAME_UNSUPPORTED;
}

@@ -1281,6 +1319,7 @@ static int spice_gst_encoder_encode_frame(VideoEncoder *video_encoder,
* later ones from being processed. So reset the pipeline.
*/
free_pipeline(encoder);
+ encoder->errors++;
}
}
/* Unref the last frame's bitmap_opaque structures if any */
--
2.8.0.rc3
Francois Gouget
2016-04-05 15:16:20 UTC
Permalink
GObject returns an error instead of clamping if given an out-of-range
property value. So clamp the bit rate to the encoder property's valid
range before setting it.
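
For reference, the clamping pattern reduced to a single uint property
(set_bitrate_clamped() is just an illustrative name; the real helper,
set_gstenc_bitrate() below, also handles the int/int64/uint64 cases and
the target-bitrate property name):

    #include <gst/gst.h>

    /* g_object_set() does not clamp: an out-of-range value is rejected
     * with a warning and the property keeps its previous value. So clamp
     * to the range advertised by the encoder's GParamSpec first.
     */
    static void set_bitrate_clamped(GstElement *gstenc, guint wanted)
    {
        GObjectClass *klass = G_OBJECT_GET_CLASS(gstenc);
        GParamSpec *spec = g_object_class_find_property(klass, "bitrate");
        if (spec && spec->value_type == G_TYPE_UINT) {
            GParamSpecUInt *range = G_PARAM_SPEC_UINT(spec);
            guint value = MAX(range->minimum, MIN(range->maximum, wanted));
            g_object_set(G_OBJECT(gstenc), "bitrate", value, NULL);
        }
    }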

Signed-off-by: Francois Gouget <***@codeweavers.com>
---
server/gstreamer-encoder.c | 90 +++++++++++++++++++++++++++++++++++-----------
1 file changed, 69 insertions(+), 21 deletions(-)

diff --git a/server/gstreamer-encoder.c b/server/gstreamer-encoder.c
index 4a7e5be..3ff6343 100644
--- a/server/gstreamer-encoder.c
+++ b/server/gstreamer-encoder.c
@@ -20,6 +20,8 @@
#include <config.h>
#endif

+#include <inttypes.h>
+
#include <gst/gst.h>
#include <gst/app/gstappsrc.h>
#include <gst/app/gstappsink.h>
@@ -867,6 +869,59 @@ static gboolean create_pipeline(SpiceGstEncoder *encoder)
return TRUE;
}

+/* A helper for configure_pipeline() */
+static void set_gstenc_bitrate(SpiceGstEncoder *encoder)
+{
+ GObjectClass *class = G_OBJECT_GET_CLASS(encoder->gstenc);
+ GParamSpec *param = g_object_class_find_property(class, "bitrate");
+ if (param == NULL) {
+ param = g_object_class_find_property(class, "target-bitrate");
+ }
+ if (param) {
+ uint64_t gst_bit_rate = encoder->video_bit_rate;
+ if (strstr(g_param_spec_get_blurb(param), "kbit")) {
+ gst_bit_rate = gst_bit_rate / 1024;
+ }
+
+ GObject * gobject = G_OBJECT(encoder->gstenc);
+ const gchar *prop = g_param_spec_get_name(param);
+ switch (param->value_type) {
+ case G_TYPE_INT: {
+ GParamSpecInt *range = G_PARAM_SPEC_INT(param);
+ gst_bit_rate = MAX(range->minimum, MIN(range->maximum, gst_bit_rate));
+ g_object_set(gobject, prop, (gint)gst_bit_rate, NULL);
+ break;
+ }
+ case G_TYPE_UINT: {
+ GParamSpecUInt *range = G_PARAM_SPEC_UINT(param);
+ gst_bit_rate = MAX(range->minimum, MIN(range->maximum, gst_bit_rate));
+ g_object_set(gobject, prop, (guint)gst_bit_rate, NULL);
+ break;
+ }
+ case G_TYPE_INT64: {
+ GParamSpecInt64 *range = G_PARAM_SPEC_INT64(param);
+ gst_bit_rate = MAX(range->minimum, MIN(range->maximum, gst_bit_rate));
+ g_object_set(gobject, prop, (gint64)gst_bit_rate, NULL);
+ break;
+ }
+ case G_TYPE_UINT64: {
+ GParamSpecUInt64 *range = G_PARAM_SPEC_UINT64(param);
+ gst_bit_rate = MAX(range->minimum, MIN(range->maximum, gst_bit_rate));
+ g_object_set(gobject, prop, (guint64)gst_bit_rate, NULL);
+ break;
+ }
+ default:
+ spice_warning("the %s property has an unsupported type %zu",
+ prop, param->value_type);
+ return;
+ }
+ spice_debug("setting the GStreamer %s to %"PRIu64, prop, gst_bit_rate);
+ } else {
+ spice_printerr("Could not find the bit rate property for %s",
+ get_gst_codec_name(encoder));
+ }
+}
+
/* A helper for spice_gst_encoder_encode_frame() */
static gboolean configure_pipeline(SpiceGstEncoder *encoder,
const SpiceBitmap *bitmap)
@@ -876,30 +931,11 @@ static gboolean configure_pipeline(SpiceGstEncoder *encoder,
}

/* Configure the encoder bitrate */
- switch (encoder->base.codec_type) {
- case SPICE_VIDEO_CODEC_TYPE_MJPEG:
- g_object_set(G_OBJECT(encoder->gstenc),
- "bitrate", (gint)encoder->video_bit_rate,
- NULL);
+ set_gstenc_bitrate(encoder);
+ if (encoder->base.codec_type == SPICE_VIDEO_CODEC_TYPE_MJPEG) {
/* See https://bugzilla.gnome.org/show_bug.cgi?id=753257 */
spice_debug("removing the pipeline clock");
gst_pipeline_use_clock(GST_PIPELINE(encoder->pipeline), NULL);
- break;
- case SPICE_VIDEO_CODEC_TYPE_VP8:
- g_object_set(G_OBJECT(encoder->gstenc),
- "target-bitrate", (gint)encoder->video_bit_rate,
- NULL);
- break;
- case SPICE_VIDEO_CODEC_TYPE_H264:
- g_object_set(G_OBJECT(encoder->gstenc),
- "bitrate", (guint)(encoder->bit_rate / 1024),
- NULL);
- break;
- default:
- /* gstreamer_encoder_new() should have rejected this codec type */
- spice_warning("unsupported codec type %d", encoder->base.codec_type);
- free_pipeline(encoder);
- return FALSE;
}

/* Set the source caps */
@@ -1211,6 +1247,17 @@ static int push_raw_frame(SpiceGstEncoder *encoder,
static int pull_compressed_buffer(SpiceGstEncoder *encoder,
VideoBuffer **outbuf)
{
+#ifdef HAVE_GSTREAMER_0_10
+ SpiceGstVideoBuffer *buffer = create_gst_video_buffer();
+ buffer->gst_buffer = gst_app_sink_pull_buffer(encoder->appsink);
+ if (buffer->gst_buffer) {
+ buffer->base.data = GST_BUFFER_DATA(buffer->gst_buffer);
+ buffer->base.size = GST_BUFFER_SIZE(buffer->gst_buffer);
+ *outbuf = (VideoBuffer*)buffer;
+ return VIDEO_ENCODER_FRAME_ENCODE_DONE;
+ }
+ buffer->base.free((VideoBuffer*)buffer);
+#else
GstSample *sample = gst_app_sink_pull_sample(encoder->appsink);
if (sample) {
SpiceGstVideoBuffer *buffer = create_gst_video_buffer();
@@ -1227,6 +1274,7 @@ static int pull_compressed_buffer(SpiceGstEncoder *encoder,
buffer->base.free((VideoBuffer*)buffer);
gst_sample_unref(sample);
}
+#endif
spice_debug("failed to pull the compressed buffer");
return VIDEO_ENCODER_FRAME_UNSUPPORTED;
}
--
2.8.0.rc3
Francois Gouget
2016-04-05 15:16:28 UTC
Permalink
Signed-off-by: Francois Gouget <***@codeweavers.com>
---
examples/spiceqxl.xorg.conf.example | 7 +++++++
src/qxl.h | 1 +
src/qxl_driver.c | 2 ++
src/spiceqxl_spice_server.c | 15 +++++++++++++++
4 files changed, 25 insertions(+)

diff --git a/examples/spiceqxl.xorg.conf.example b/examples/spiceqxl.xorg.conf.example
index ec6321e..b6f4840 100644
--- a/examples/spiceqxl.xorg.conf.example
+++ b/examples/spiceqxl.xorg.conf.example
@@ -52,6 +52,13 @@ Section "Device"
# default: filter
#Option "SpiceStreamingVideo" ""

+ # Set the video codecs to use. Provide a semicolon-separated list of
+ # codecs, in preference order. Each entry takes the form encoder:codec,
+ # where the encoder is spice or gstreamer and the codec is, for instance,
+ # mjpeg or vp8 (e.g. "gstreamer:vp8;spice:mjpeg"). The default is
+ # spice:mjpeg, which uses the builtin mjpeg encoder.
+ #Option "SpiceVideoCodecs" ""
+
# Set zlib glz wan compression. Options are auto, never, always.
# default: auto
#Option "SpiceZlibGlzWanCompression" ""
diff --git a/src/qxl.h b/src/qxl.h
index ff55604..5cc8d05 100644
--- a/src/qxl.h
+++ b/src/qxl.h
@@ -158,6 +158,7 @@ enum {
OPTION_SURFACE_BUFFER_SIZE,
OPTION_COMMAND_BUFFER_SIZE,
OPTION_SPICE_SMARTCARD_FILE,
+ OPTION_SPICE_VIDEO_CODECS,
#endif
OPTION_COUNT,
};
diff --git a/src/qxl_driver.c b/src/qxl_driver.c
index e21addd..fc1b629 100644
--- a/src/qxl_driver.c
+++ b/src/qxl_driver.c
@@ -154,6 +154,8 @@ const OptionInfoRec DefaultOptions[] =
"CommandBufferSize", OPTV_INTEGER, {DEFAULT_COMMAND_BUFFER_SIZE}, FALSE},
{ OPTION_SPICE_SMARTCARD_FILE,
"SpiceSmartcardFile", OPTV_STRING, {0}, FALSE},
+ { OPTION_SPICE_VIDEO_CODECS,
+ "SpiceVideoCodecs", OPTV_STRING, {0}, FALSE},
#endif

{ -1, NULL, OPTV_NONE, {0}, FALSE }
diff --git a/src/spiceqxl_spice_server.c b/src/spiceqxl_spice_server.c
index b2b31ff..15b0531 100644
--- a/src/spiceqxl_spice_server.c
+++ b/src/spiceqxl_spice_server.c
@@ -173,6 +173,9 @@ void xspice_set_spice_server_options(OptionInfoPtr options)
const char *streaming_video =
get_str_option(options, OPTION_SPICE_STREAMING_VIDEO,
"XSPICE_STREAMING_VIDEO");
+ const char *video_codecs =
+ get_str_option(options, OPTION_SPICE_VIDEO_CODECS,
+ "XSPICE_VIDEO_CODECS");
int agent_mouse =
get_bool_option(options, OPTION_SPICE_AGENT_MOUSE,
"XSPICE_AGENT_MOUSE");
@@ -294,6 +297,18 @@ void xspice_set_spice_server_options(OptionInfoPtr options)
spice_server_set_streaming_video(spice_server, streaming_video_opt);
}

+ if (video_codecs) {
+#if SPICE_SERVER_VERSION >= 0x000c06 /* 0.12.6 */
+ if (spice_server_set_video_codecs(spice_server, video_codecs)) {
+ fprintf(stderr, "spice: invalid video codecs %s\n", video_codecs);
+ exit(1);
+ }
+#else
+ fprintf(stderr, "spice: video_codecs are not available (spice >= 0.12.6 required)\n");
+ exit(1);
+#endif
+ }
+
spice_server_set_agent_mouse(spice_server, agent_mouse);
spice_server_set_playback_compression(spice_server, playback_compression);
--
2.8.0.rc3
Francois Gouget
2016-04-05 15:16:37 UTC
Permalink
Signed-off-by: Francois Gouget <***@codeweavers.com>
---
scripts/Xspice | 3 ++-
1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/scripts/Xspice b/scripts/Xspice
index 15a5a5e..bf8112f 100755
--- a/scripts/Xspice
+++ b/scripts/Xspice
@@ -87,6 +87,7 @@ parser.add_argument('--zlib-glz-wan-compression',
# TODO - sound support
parser.add_argument('--streaming-video', choices=['off', 'all', 'filter'],
help='filter by default')
+parser.add_argument('--video-codecs', help="Sets a semicolon-separated list of preferred video codecs to use. Each takes the form encoder:codec, with spice:mjpeg being the default and other options being provided by gstreamer for the mjpeg, vp8 and h264 codecs.")
add_boolean('--ipv4-only')
add_boolean('--ipv6-only')
parser.add_argument('--vdagent', action='store_true', dest='vdagent_enabled', default=False, help='launch vdagent & vdagentd. They provide clipboard & resolution automation')
@@ -282,7 +283,7 @@ var_args = ['port', 'tls_port', 'disable_ticketing',
'x509_key_file', 'x509_key_password',
'tls_ciphers', 'dh_file', 'password', 'image_compression',
'jpeg_wan_compression', 'zlib_glz_wan_compression',
- 'streaming_video', 'deferred_fps', 'exit_on_disconnect',
+ 'streaming_video', 'video_codecs', 'deferred_fps', 'exit_on_disconnect',
'vdagent_enabled', 'vdagent_virtio_path', 'vdagent_uinput_path',
'vdagent_uid', 'vdagent_gid']
--
2.8.0.rc3
Francois Gouget
2016-04-05 15:16:46 UTC
Permalink
This replaces the original channel-display-mjpeg API with a VideoDecoder
base class that other decoders can implement.
Furthermore this moves the MJPEG-specific state information from the
display_stream struct to a class derived from VideoDecoder.
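
As an illustration of the new API, a decoder only has to fill in the
VideoDecoder function pointers. A minimal sketch with a hypothetical
NullDecoder (not part of this series) would look like:

    typedef struct NullDecoder {
        VideoDecoder base;
        uint8_t *out_frame;
    } NullDecoder;

    /* A real decoder would decompress frame_msg here and return a
     * width*height*4 buffer it owns (valid until the next call);
     * returning NULL simply means there is nothing to display.
     */
    static uint8_t *null_decoder_decode_frame(VideoDecoder *video_decoder,
                                              SpiceMsgIn *frame_msg)
    {
        NullDecoder *decoder = (NullDecoder*)video_decoder;
        return decoder->out_frame;
    }

    static void null_decoder_destroy(VideoDecoder *video_decoder)
    {
        NullDecoder *decoder = (NullDecoder*)video_decoder;
        g_free(decoder->out_frame);
        free(decoder);
    }

    VideoDecoder *create_null_decoder(int codec_type, display_stream *stream)
    {
        NullDecoder *decoder = spice_new0(NullDecoder, 1);
        decoder->base.destroy = null_decoder_destroy;
        decoder->base.decode_frame = null_decoder_decode_frame;
        decoder->base.codec_type = codec_type;
        decoder->base.stream = stream;
        return (VideoDecoder*)decoder;
    }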

Signed-off-by: Francois Gouget <***@codeweavers.com>
---
src/channel-display-mjpeg.c | 131 ++++++++++++++++++++++++++++----------------
src/channel-display-priv.h | 53 +++++++++++++-----
src/channel-display.c | 63 ++++++++++-----------
3 files changed, 151 insertions(+), 96 deletions(-)

diff --git a/src/channel-display-mjpeg.c b/src/channel-display-mjpeg.c
index 95d5b33..c7e1c6f 100644
--- a/src/channel-display-mjpeg.c
+++ b/src/channel-display-mjpeg.c
@@ -23,12 +23,33 @@

#include "channel-display-priv.h"

+
+/* MJpeg decoder implementation */
+
+typedef struct MJpegDecoder {
+ VideoDecoder base;
+
+ /* ---------- The builtin mjpeg decoder ---------- */
+
+ SpiceMsgIn *frame_msg;
+ struct jpeg_source_mgr mjpeg_src;
+ struct jpeg_decompress_struct mjpeg_cinfo;
+ struct jpeg_error_mgr mjpeg_jerr;
+
+ /* ---------- Output frame data ---------- */
+
+ uint8_t *out_frame;
+} MJpegDecoder;
+
+
+/* ---------- The JPEG library callbacks ---------- */
+
static void mjpeg_src_init(struct jpeg_decompress_struct *cinfo)
{
- display_stream *st = SPICE_CONTAINEROF(cinfo->src, display_stream, mjpeg_src);
- uint8_t *data;
+ MJpegDecoder *decoder = SPICE_CONTAINEROF(cinfo->src, MJpegDecoder, mjpeg_src);

- cinfo->src->bytes_in_buffer = stream_get_current_frame(st, &data);
+ uint8_t *data;
+ cinfo->src->bytes_in_buffer = spice_msg_in_frame_data(decoder->frame_msg, &data);
cinfo->src->next_input_byte = data;
}

@@ -49,68 +70,57 @@ static void mjpeg_src_term(struct jpeg_decompress_struct *cinfo)
/* nothing */
}

-G_GNUC_INTERNAL
-void stream_mjpeg_init(display_stream *st)
-{
- st->mjpeg_cinfo.err = jpeg_std_error(&st->mjpeg_jerr);
- jpeg_create_decompress(&st->mjpeg_cinfo);
-
- st->mjpeg_src.init_source = mjpeg_src_init;
- st->mjpeg_src.fill_input_buffer = mjpeg_src_fill;
- st->mjpeg_src.skip_input_data = mjpeg_src_skip;
- st->mjpeg_src.resync_to_restart = jpeg_resync_to_restart;
- st->mjpeg_src.term_source = mjpeg_src_term;
- st->mjpeg_cinfo.src = &st->mjpeg_src;
-}

-G_GNUC_INTERNAL
-void stream_mjpeg_data(display_stream *st)
+/* ---------- VideoDecoder's public API ---------- */
+
+static uint8_t* mjpeg_decoder_decode_frame(VideoDecoder *video_decoder,
+ SpiceMsgIn *frame_msg)
{
- gboolean back_compat = st->channel->priv->peer_hdr.major_version == 1;
+ MJpegDecoder *decoder = (MJpegDecoder*)video_decoder;
+ gboolean back_compat = decoder->base.stream->channel->priv->peer_hdr.major_version == 1;
int width;
int height;
uint8_t *dest;
uint8_t *lines[4];

- stream_get_dimensions(st, &width, &height);
- dest = g_malloc0(width * height * 4);
+ decoder->frame_msg = frame_msg;
+ stream_get_dimensions(decoder->base.stream, frame_msg, &width, &height);
+ g_free(decoder->out_frame);
+ dest = decoder->out_frame = g_malloc0(width * height * 4);

- g_free(st->out_frame);
- st->out_frame = dest;
-
- jpeg_read_header(&st->mjpeg_cinfo, 1);
+ jpeg_read_header(&decoder->mjpeg_cinfo, 1);
#ifdef JCS_EXTENSIONS
// requires jpeg-turbo
if (back_compat)
- st->mjpeg_cinfo.out_color_space = JCS_EXT_RGBX;
+ decoder->mjpeg_cinfo.out_color_space = JCS_EXT_RGBX;
else
- st->mjpeg_cinfo.out_color_space = JCS_EXT_BGRX;
+ decoder->mjpeg_cinfo.out_color_space = JCS_EXT_BGRX;
#else
#warning "You should consider building with libjpeg-turbo"
- st->mjpeg_cinfo.out_color_space = JCS_RGB;
+ decoder->mjpeg_cinfo.out_color_space = JCS_RGB;
#endif

#ifndef SPICE_QUALITY
- st->mjpeg_cinfo.dct_method = JDCT_IFAST;
- st->mjpeg_cinfo.do_fancy_upsampling = FALSE;
- st->mjpeg_cinfo.do_block_smoothing = FALSE;
- st->mjpeg_cinfo.dither_mode = JDITHER_ORDERED;
+ decoder->mjpeg_cinfo.dct_method = JDCT_IFAST;
+ decoder->mjpeg_cinfo.do_fancy_upsampling = FALSE;
+ decoder->mjpeg_cinfo.do_block_smoothing = FALSE;
+ decoder->mjpeg_cinfo.dither_mode = JDITHER_ORDERED;
#endif
// TODO: in theory should check cinfo.output_height match with our height
- jpeg_start_decompress(&st->mjpeg_cinfo);
+ jpeg_start_decompress(&decoder->mjpeg_cinfo);
/* rec_outbuf_height is the recommended size of the output buffer we
* pass to libjpeg for optimum performance
*/
- if (st->mjpeg_cinfo.rec_outbuf_height > G_N_ELEMENTS(lines)) {
- jpeg_abort_decompress(&st->mjpeg_cinfo);
- g_return_if_reached();
+ if (decoder->mjpeg_cinfo.rec_outbuf_height > G_N_ELEMENTS(lines)) {
+ jpeg_abort_decompress(&decoder->mjpeg_cinfo);
+ g_return_val_if_reached(NULL);
}

- while (st->mjpeg_cinfo.output_scanline < st->mjpeg_cinfo.output_height) {
+ while (decoder->mjpeg_cinfo.output_scanline < decoder->mjpeg_cinfo.output_height) {
/* only used when JCS_EXTENSIONS is undefined */
G_GNUC_UNUSED unsigned int lines_read;

- for (unsigned int j = 0; j < st->mjpeg_cinfo.rec_outbuf_height; j++) {
+ for (unsigned int j = 0; j < decoder->mjpeg_cinfo.rec_outbuf_height; j++) {
lines[j] = dest;
#ifdef JCS_EXTENSIONS
dest += 4 * width;
@@ -118,8 +128,8 @@ void stream_mjpeg_data(display_stream *st)
dest += 3 * width;
#endif
}
- lines_read = jpeg_read_scanlines(&st->mjpeg_cinfo, lines,
- st->mjpeg_cinfo.rec_outbuf_height);
+ lines_read = jpeg_read_scanlines(&decoder->mjpeg_cinfo, lines,
+ decoder->mjpeg_cinfo.rec_outbuf_height);
#ifndef JCS_EXTENSIONS
{
uint8_t *s = lines[0];
@@ -142,15 +152,44 @@ void stream_mjpeg_data(display_stream *st)
}
}
#endif
- dest = &st->out_frame[st->mjpeg_cinfo.output_scanline * width * 4];
+ dest = &(decoder->out_frame[decoder->mjpeg_cinfo.output_scanline * width * 4]);
}
- jpeg_finish_decompress(&st->mjpeg_cinfo);
+ jpeg_finish_decompress(&decoder->mjpeg_cinfo);
+
+ return decoder->out_frame;
+}
+
+static void mjpeg_decoder_destroy(VideoDecoder* video_decoder)
+{
+ MJpegDecoder *decoder = (MJpegDecoder*)video_decoder;
+ jpeg_destroy_decompress(&decoder->mjpeg_cinfo);
+ g_free(decoder->out_frame);
+ free(decoder);
}

G_GNUC_INTERNAL
-void stream_mjpeg_cleanup(display_stream *st)
+VideoDecoder* create_mjpeg_decoder(int codec_type, display_stream *stream)
{
- jpeg_destroy_decompress(&st->mjpeg_cinfo);
- g_free(st->out_frame);
- st->out_frame = NULL;
+ g_return_val_if_fail(codec_type == SPICE_VIDEO_CODEC_TYPE_MJPEG, NULL);
+
+ MJpegDecoder *decoder = spice_new0(MJpegDecoder, 1);
+
+ decoder->base.destroy = mjpeg_decoder_destroy;
+ decoder->base.decode_frame = mjpeg_decoder_decode_frame;
+ decoder->base.codec_type = codec_type;
+ decoder->base.stream = stream;
+
+ decoder->mjpeg_cinfo.err = jpeg_std_error(&decoder->mjpeg_jerr);
+ jpeg_create_decompress(&decoder->mjpeg_cinfo);
+
+ decoder->mjpeg_src.init_source = mjpeg_src_init;
+ decoder->mjpeg_src.fill_input_buffer = mjpeg_src_fill;
+ decoder->mjpeg_src.skip_input_data = mjpeg_src_skip;
+ decoder->mjpeg_src.resync_to_restart = jpeg_resync_to_restart;
+ decoder->mjpeg_src.term_source = mjpeg_src_term;
+ decoder->mjpeg_cinfo.src = &decoder->mjpeg_src;
+
+ /* All the other fields are initialized to zero by spice_new0(). */
+
+ return (VideoDecoder*)decoder;
}
diff --git a/src/channel-display-priv.h b/src/channel-display-priv.h
index f92477b..7f1c520 100644
--- a/src/channel-display-priv.h
+++ b/src/channel-display-priv.h
@@ -34,6 +34,39 @@

G_BEGIN_DECLS

+typedef struct display_stream display_stream;
+
+typedef struct VideoDecoder VideoDecoder;
+struct VideoDecoder {
+ /* Releases the video decoder's resources */
+ void (*destroy)(VideoDecoder *decoder);
+
+ /* Decompresses the specified frame.
+ *
+ * @decoder: The video decoder.
+ * @frame_msg: The Spice message containing the compressed frame.
+ * @return: A pointer to the buffer holding the decoded frame. This
+ * buffer will be invalidated by the next call to
+ * decode_frame().
+ */
+ uint8_t* (*decode_frame)(VideoDecoder *decoder, SpiceMsgIn *frame_msg);
+
+ /* The format of the encoded video. */
+ int codec_type;
+
+ /* The associated display stream. */
+ display_stream *stream;
+};
+
+
+/* Instantiates the video decoder for the specified codec.
+ *
+ * @codec_type: The format of the video.
+ * @stream: The associated video stream.
+ * @return: A pointer to a structure implementing the VideoDecoder methods.
+ */
+VideoDecoder* create_mjpeg_decoder(int codec_type, display_stream *stream);
+

typedef struct display_surface {
guint32 surface_id;
@@ -53,24 +86,18 @@ typedef struct drops_sequence_stats {
uint32_t duration;
} drops_sequence_stats;

-typedef struct display_stream {
+struct display_stream {
SpiceMsgIn *msg_create;
SpiceMsgIn *msg_clip;
- SpiceMsgIn *msg_data;

/* from messages */
display_surface *surface;
SpiceClip *clip;
QRegion region;
int have_region;
- int codec;

- /* mjpeg decoder */
- struct jpeg_source_mgr mjpeg_src;
- struct jpeg_decompress_struct mjpeg_cinfo;
- struct jpeg_error_mgr mjpeg_jerr;
+ VideoDecoder *video_decoder;

- uint8_t *out_frame;
GQueue *msgq;
guint timeout;
SpiceChannel *channel;
@@ -97,15 +124,11 @@ typedef struct display_stream {
uint32_t report_num_frames;
uint32_t report_num_drops;
uint32_t report_drops_seq_len;
-} display_stream;
+};

-void stream_get_dimensions(display_stream *st, int *width, int *height);
-uint32_t stream_get_current_frame(display_stream *st, uint8_t **data);
+void stream_get_dimensions(display_stream *st, SpiceMsgIn *frame_msg, int *width, int *height);
+uint32_t spice_msg_in_frame_data(SpiceMsgIn *frame_msg, uint8_t **data);

-/* channel-display-mjpeg.c */
-void stream_mjpeg_init(display_stream *st);
-void stream_mjpeg_data(display_stream *st);
-void stream_mjpeg_cleanup(display_stream *st);

G_END_DECLS

diff --git a/src/channel-display.c b/src/channel-display.c
index 431c2e7..50b5bd2 100644
--- a/src/channel-display.c
+++ b/src/channel-display.c
@@ -1089,7 +1089,6 @@ static void display_handle_stream_create(SpiceChannel *channel, SpiceMsgIn *in)
st->msg_create = in;
spice_msg_in_ref(in);
st->clip = &op->clip;
- st->codec = op->codec_type;
st->surface = find_surface(c, op->surface_id);
st->msgq = g_queue_new();
st->channel = channel;
@@ -1098,10 +1097,15 @@ static void display_handle_stream_create(SpiceChannel *channel, SpiceMsgIn *in)
region_init(&st->region);
display_update_stream_region(st);

- switch (st->codec) {
+ switch (op->codec_type) {
case SPICE_VIDEO_CODEC_TYPE_MJPEG:
- stream_mjpeg_init(st);
+ st->video_decoder = create_mjpeg_decoder(op->codec_type, st);
break;
+ default:
+ st->video_decoder = NULL;
+ }
+ if (st->video_decoder == NULL) {
+ spice_printerr("could not create a video decoder for codec %d", op->codec_type);
}
}

@@ -1144,15 +1148,15 @@ static gboolean display_stream_schedule(display_stream *st)
return FALSE;
}

-static SpiceRect *stream_get_dest(display_stream *st)
+static SpiceRect *stream_get_dest(display_stream *st, SpiceMsgIn *frame_msg)
{
- if (st->msg_data == NULL ||
- spice_msg_in_type(st->msg_data) != SPICE_MSG_DISPLAY_STREAM_DATA_SIZED) {
+ if (frame_msg == NULL ||
+ spice_msg_in_type(frame_msg) != SPICE_MSG_DISPLAY_STREAM_DATA_SIZED) {
SpiceMsgDisplayStreamCreate *info = spice_msg_in_parsed(st->msg_create);

return &info->dest;
} else {
- SpiceMsgDisplayStreamDataSized *op = spice_msg_in_parsed(st->msg_data);
+ SpiceMsgDisplayStreamDataSized *op = spice_msg_in_parsed(frame_msg);

return &op->dest;
}
@@ -1167,21 +1171,16 @@ static uint32_t stream_get_flags(display_stream *st)
}

G_GNUC_INTERNAL
-uint32_t stream_get_current_frame(display_stream *st, uint8_t **data)
+uint32_t spice_msg_in_frame_data(SpiceMsgIn *frame_msg, uint8_t **data)
{
- if (st->msg_data == NULL) {
- *data = NULL;
- return 0;
- }
-
- switch (spice_msg_in_type(st->msg_data)) {
+ switch (spice_msg_in_type(frame_msg)) {
case SPICE_MSG_DISPLAY_STREAM_DATA: {
- SpiceMsgDisplayStreamData *op = spice_msg_in_parsed(st->msg_data);
+ SpiceMsgDisplayStreamData *op = spice_msg_in_parsed(frame_msg);
*data = op->data;
return op->data_size;
}
case SPICE_MSG_DISPLAY_STREAM_DATA_SIZED: {
- SpiceMsgDisplayStreamDataSized *op = spice_msg_in_parsed(st->msg_data);
+ SpiceMsgDisplayStreamDataSized *op = spice_msg_in_parsed(frame_msg);
*data = op->data;
return op->data_size;
}
@@ -1192,19 +1191,19 @@ uint32_t stream_get_current_frame(display_stream *st, uint8_t **data)
}

G_GNUC_INTERNAL
-void stream_get_dimensions(display_stream *st, int *width, int *height)
+void stream_get_dimensions(display_stream *st, SpiceMsgIn *frame_msg, int *width, int *height)
{
g_return_if_fail(width != NULL);
g_return_if_fail(height != NULL);

- if (st->msg_data == NULL ||
- spice_msg_in_type(st->msg_data) != SPICE_MSG_DISPLAY_STREAM_DATA_SIZED) {
+ if (frame_msg == NULL ||
+ spice_msg_in_type(frame_msg) != SPICE_MSG_DISPLAY_STREAM_DATA_SIZED) {
SpiceMsgDisplayStreamCreate *info = spice_msg_in_parsed(st->msg_create);

*width = info->stream_width;
*height = info->stream_height;
} else {
- SpiceMsgDisplayStreamDataSized *op = spice_msg_in_parsed(st->msg_data);
+ SpiceMsgDisplayStreamDataSized *op = spice_msg_in_parsed(frame_msg);

*width = op->width;
*height = op->height;
@@ -1222,24 +1221,21 @@ static gboolean display_stream_render(display_stream *st)

g_return_val_if_fail(in != NULL, FALSE);

- st->msg_data = in;
- switch (st->codec) {
- case SPICE_VIDEO_CODEC_TYPE_MJPEG:
- stream_mjpeg_data(st);
- break;
+ uint8_t *out_frame = NULL;
+ if (st->video_decoder) {
+ out_frame = st->video_decoder->decode_frame(st->video_decoder, in);
}
-
- if (st->out_frame) {
+ if (out_frame) {
int width;
int height;
SpiceRect *dest;
uint8_t *data;
int stride;

- stream_get_dimensions(st, &width, &height);
- dest = stream_get_dest(st);
+ stream_get_dimensions(st, in, &width, &height);
+ dest = stream_get_dest(st, in);

- data = st->out_frame;
+ data = out_frame;
stride = width * sizeof(uint32_t);
if (!(stream_get_flags(st) & SPICE_STREAM_FLAGS_TOP_DOWN)) {
data += stride * (height - 1);
@@ -1262,7 +1258,6 @@ static gboolean display_stream_render(display_stream *st)
dest->bottom - dest->top);
}

- st->msg_data = NULL;
spice_msg_in_unref(in);

in = g_queue_peek_head(st->msgq);
@@ -1571,10 +1566,8 @@ static void destroy_stream(SpiceChannel *channel, int id)

g_array_free(st->drops_seqs_stats_arr, TRUE);

- switch (st->codec) {
- case SPICE_VIDEO_CODEC_TYPE_MJPEG:
- stream_mjpeg_cleanup(st);
- break;
+ if (st->video_decoder) {
+ st->video_decoder->destroy(st->video_decoder);
}

if (st->msg_clip)
--
2.8.0.rc3
Francois Gouget
2016-04-05 15:16:50 UTC
Permalink
The MJPEG decoder does not need a zero-filled buffer.

Signed-off-by: Francois Gouget <***@codeweavers.com>
---
src/channel-display-mjpeg.c | 9 +++++++--
1 file changed, 7 insertions(+), 2 deletions(-)

diff --git a/src/channel-display-mjpeg.c b/src/channel-display-mjpeg.c
index c7e1c6f..927827b 100644
--- a/src/channel-display-mjpeg.c
+++ b/src/channel-display-mjpeg.c
@@ -39,6 +39,7 @@ typedef struct MJpegDecoder {
/* ---------- Output frame data ---------- */

uint8_t *out_frame;
+ uint32_t out_size;
} MJpegDecoder;


@@ -85,8 +86,12 @@ static uint8_t* mjpeg_decoder_decode_frame(VideoDecoder *video_decoder,

decoder->frame_msg = frame_msg;
stream_get_dimensions(decoder->base.stream, frame_msg, &width, &height);
- g_free(decoder->out_frame);
- dest = decoder->out_frame = g_malloc0(width * height * 4);
+ if (decoder->out_size < width * height * 4) {
+ g_free(decoder->out_frame);
+ decoder->out_size = width * height * 4;
+ decoder->out_frame = g_malloc(decoder->out_size);
+ }
+ dest = decoder->out_frame;

jpeg_read_header(&decoder->mjpeg_cinfo, 1);
#ifdef JCS_EXTENSIONS
--
2.8.0.rc3
Francois Gouget
2016-04-05 15:16:54 UTC
Permalink
A frame that arrives late may not actually get dropped once that decision
is left up to the video decoders.

Signed-off-by: Francois Gouget <***@codeweavers.com>
---
src/channel-display-priv.h | 2 +-
src/channel-display.c | 8 ++++----
2 files changed, 5 insertions(+), 5 deletions(-)

diff --git a/src/channel-display-priv.h b/src/channel-display-priv.h
index 7f1c520..5256ad9 100644
--- a/src/channel-display-priv.h
+++ b/src/channel-display-priv.h
@@ -104,7 +104,7 @@ struct display_stream {

/* stats */
uint32_t first_frame_mm_time;
- uint32_t num_drops_on_receive;
+ uint32_t arrive_late_count;
uint64_t arrive_late_time;
uint32_t num_drops_on_playback;
uint32_t num_input_frames;
diff --git a/src/channel-display.c b/src/channel-display.c
index 50b5bd2..adba315 100644
--- a/src/channel-display.c
+++ b/src/channel-display.c
@@ -1461,7 +1461,7 @@ static void display_handle_stream_data(SpiceChannel *channel, SpiceMsgIn *in)
CHANNEL_DEBUG(channel, "stream data too late by %u ms (ts: %u, mmtime: %u), dropping",
mmtime - op->multi_media_time, op->multi_media_time, mmtime);
st->arrive_late_time += mmtime - op->multi_media_time;
- st->num_drops_on_receive++;
+ st->arrive_late_count++;

if (!st->cur_drops_seq_stats.len) {
st->cur_drops_seq_stats.start_mm_time = op->multi_media_time;
@@ -1537,15 +1537,15 @@ static void destroy_stream(SpiceChannel *channel, int id)
if (!st)
return;

- num_out_frames = st->num_input_frames - st->num_drops_on_receive - st->num_drops_on_playback;
+ num_out_frames = st->num_input_frames - st->arrive_late_count - st->num_drops_on_playback;
CHANNEL_DEBUG(channel, "%s: id=%d #in-frames=%d out/in=%.2f "
"#drops-on-receive=%d avg-late-time(ms)=%.2f "
"#drops-on-playback=%d", __FUNCTION__,
id,
st->num_input_frames,
num_out_frames / (double)st->num_input_frames,
- st->num_drops_on_receive,
- st->num_drops_on_receive ? st->arrive_late_time / ((double)st->num_drops_on_receive): 0,
+ st->arrive_late_count,
+ st->arrive_late_count ? st->arrive_late_time / ((double)st->arrive_late_count): 0,
st->num_drops_on_playback);
if (st->num_drops_seqs) {
CHANNEL_DEBUG(channel, "%s: #drops-sequences=%u ==>", __FUNCTION__, st->num_drops_seqs);
--
2.8.0.rc3
Francois Gouget
2016-04-05 15:16:58 UTC
Permalink
Signed-off-by: Francois Gouget <***@codeweavers.com>
---
src/channel-display-mjpeg.c | 142 ++++++++++++++++++++++++++++---
src/channel-display-priv.h | 10 ++-
src/channel-display.c | 201 +++++++++++---------------------------------
3 files changed, 189 insertions(+), 164 deletions(-)

diff --git a/src/channel-display-mjpeg.c b/src/channel-display-mjpeg.c
index 927827b..f3f4ceb 100644
--- a/src/channel-display-mjpeg.c
+++ b/src/channel-display-mjpeg.c
@@ -31,11 +31,16 @@ typedef struct MJpegDecoder {

/* ---------- The builtin mjpeg decoder ---------- */

- SpiceMsgIn *frame_msg;
struct jpeg_source_mgr mjpeg_src;
struct jpeg_decompress_struct mjpeg_cinfo;
struct jpeg_error_mgr mjpeg_jerr;

+ /* ---------- Frame queue ---------- */
+
+ GQueue *msgq;
+ SpiceMsgIn *cur_frame_msg;
+ guint timer_id;
+
/* ---------- Output frame data ---------- */

uint8_t *out_frame;
@@ -50,7 +55,7 @@ static void mjpeg_src_init(struct jpeg_decompress_struct *cinfo)
MJpegDecoder *decoder = SPICE_CONTAINEROF(cinfo->src, MJpegDecoder, mjpeg_src);

uint8_t *data;
- cinfo->src->bytes_in_buffer = spice_msg_in_frame_data(decoder->frame_msg, &data);
+ cinfo->src->bytes_in_buffer = spice_msg_in_frame_data(decoder->cur_frame_msg, &data);
cinfo->src->next_input_byte = data;
}

@@ -72,10 +77,12 @@ static void mjpeg_src_term(struct jpeg_decompress_struct *cinfo)
}


-/* ---------- VideoDecoder's public API ---------- */
+/* ---------- Decoder proper ---------- */

-static uint8_t* mjpeg_decoder_decode_frame(VideoDecoder *video_decoder,
- SpiceMsgIn *frame_msg)
+static void mjpeg_decoder_schedule(MJpegDecoder *decoder);
+
+/* main context */
+static gboolean mjpeg_decoder_decode_frame(gpointer video_decoder)
{
MJpegDecoder *decoder = (MJpegDecoder*)video_decoder;
gboolean back_compat = decoder->base.stream->channel->priv->peer_hdr.major_version == 1;
@@ -84,8 +91,7 @@ static uint8_t* mjpeg_decoder_decode_frame(VideoDecoder *video_decoder,
uint8_t *dest;
uint8_t *lines[4];

- decoder->frame_msg = frame_msg;
- stream_get_dimensions(decoder->base.stream, frame_msg, &width, &height);
+ stream_get_dimensions(decoder->base.stream, decoder->cur_frame_msg, &width, &height);
if (decoder->out_size < width * height * 4) {
g_free(decoder->out_frame);
decoder->out_size = width * height * 4;
@@ -118,7 +124,7 @@ static uint8_t* mjpeg_decoder_decode_frame(VideoDecoder *video_decoder,
*/
if (decoder->mjpeg_cinfo.rec_outbuf_height > G_N_ELEMENTS(lines)) {
jpeg_abort_decompress(&decoder->mjpeg_cinfo);
- g_return_val_if_reached(NULL);
+ g_return_val_if_reached(G_SOURCE_REMOVE);
}

while (decoder->mjpeg_cinfo.output_scanline < decoder->mjpeg_cinfo.output_height) {
@@ -161,12 +167,125 @@ static uint8_t* mjpeg_decoder_decode_frame(VideoDecoder *video_decoder,
}
jpeg_finish_decompress(&decoder->mjpeg_cinfo);

- return decoder->out_frame;
+ /* Display the frame and dispose of it */
+ stream_display_frame(decoder->base.stream, decoder->cur_frame_msg, decoder->out_frame);
+ spice_msg_in_unref(decoder->cur_frame_msg);
+ decoder->cur_frame_msg = NULL;
+ decoder->timer_id = 0;
+
+ /* Schedule the next frame */
+ mjpeg_decoder_schedule(decoder);
+
+ return G_SOURCE_REMOVE;
+}
+
+/* ---------- VideoDecoder's queue scheduling ---------- */
+
+static void mjpeg_decoder_schedule(MJpegDecoder *decoder)
+{
+ SPICE_DEBUG("%s", __FUNCTION__);
+ if (decoder->timer_id) {
+ return;
+ }
+
+ guint32 time = stream_get_time(decoder->base.stream);
+ SpiceMsgIn *frame_msg = decoder->cur_frame_msg;
+ decoder->cur_frame_msg = NULL;
+ do {
+ if (frame_msg) {
+ SpiceStreamDataHeader *op = spice_msg_in_parsed(frame_msg);
+ if (time <= op->multi_media_time) {
+ guint32 d = op->multi_media_time - time;
+ decoder->cur_frame_msg = frame_msg;
+ decoder->timer_id = g_timeout_add(d, mjpeg_decoder_decode_frame, decoder);
+ break;
+ }
+
+ SPICE_DEBUG("%s: rendering too late by %u ms (ts: %u, mmtime: %u), dropping ",
+ __FUNCTION__, time - op->multi_media_time,
+ op->multi_media_time, time);
+ stream_dropped_frame(decoder->base.stream);
+ spice_msg_in_unref(frame_msg);
+ }
+ frame_msg = g_queue_pop_head(decoder->msgq);
+ } while (frame_msg);
+}
+
+
+/* mjpeg_decoder_drop_queue() helper */
+static void _msg_in_unref_func(gpointer data, gpointer user_data)
+{
+ spice_msg_in_unref(data);
+}
+
+static void mjpeg_decoder_drop_queue(MJpegDecoder *decoder)
+{
+ if (decoder->timer_id != 0) {
+ g_source_remove(decoder->timer_id);
+ decoder->timer_id = 0;
+ }
+ if (decoder->cur_frame_msg) {
+ spice_msg_in_unref(decoder->cur_frame_msg);
+ decoder->cur_frame_msg = NULL;
+ }
+ g_queue_foreach(decoder->msgq, _msg_in_unref_func, NULL);
+ g_queue_clear(decoder->msgq);
+}
+
+/* ---------- VideoDecoder's public API ---------- */
+
+static void mjpeg_decoder_queue_frame(VideoDecoder *video_decoder,
+ SpiceMsgIn *frame_msg, int32_t latency)
+{
+ MJpegDecoder *decoder = (MJpegDecoder*)video_decoder;
+ SpiceMsgIn *last_msg;
+
+ SPICE_DEBUG("%s", __FUNCTION__);
+
+ last_msg = g_queue_peek_tail(decoder->msgq);
+ if (last_msg) {
+ SpiceStreamDataHeader *last_op, *frame_op;
+ last_op = spice_msg_in_parsed(last_msg);
+ frame_op = spice_msg_in_parsed(frame_msg);
+ if (frame_op->multi_media_time < last_op->multi_media_time) {
+ /* This should really not happen */
+ SPICE_DEBUG("new-frame-time < last-frame-time (%u < %u):"
+ " resetting stream, id %d",
+ frame_op->multi_media_time,
+ last_op->multi_media_time, frame_op->id);
+ mjpeg_decoder_drop_queue(decoder);
+ }
+ }
+
+ /* Dropped MJPEG frames don't impact the ones that come after.
+ * So drop late frames as early as possible to save on processing time.
+ */
+ if (latency < 0) {
+ return;
+ }
+
+ spice_msg_in_ref(frame_msg);
+ g_queue_push_tail(decoder->msgq, frame_msg);
+ mjpeg_decoder_schedule(decoder);
+}
+
+static void mjpeg_decoder_reschedule(VideoDecoder *video_decoder)
+{
+ MJpegDecoder *decoder = (MJpegDecoder*)video_decoder;
+
+ SPICE_DEBUG("%s", __FUNCTION__);
+ if (decoder->timer_id != 0) {
+ g_source_remove(decoder->timer_id);
+ decoder->timer_id = 0;
+ }
+ mjpeg_decoder_schedule(decoder);
}

static void mjpeg_decoder_destroy(VideoDecoder* video_decoder)
{
MJpegDecoder *decoder = (MJpegDecoder*)video_decoder;
+
+ mjpeg_decoder_drop_queue(decoder);
jpeg_destroy_decompress(&decoder->mjpeg_cinfo);
g_free(decoder->out_frame);
free(decoder);
@@ -180,10 +299,13 @@ VideoDecoder* create_mjpeg_decoder(int codec_type, display_stream *stream)
MJpegDecoder *decoder = spice_new0(MJpegDecoder, 1);

decoder->base.destroy = mjpeg_decoder_destroy;
- decoder->base.decode_frame = mjpeg_decoder_decode_frame;
+ decoder->base.reschedule = mjpeg_decoder_reschedule;
+ decoder->base.queue_frame = mjpeg_decoder_queue_frame;
decoder->base.codec_type = codec_type;
decoder->base.stream = stream;

+ decoder->msgq = g_queue_new();
+
decoder->mjpeg_cinfo.err = jpeg_std_error(&decoder->mjpeg_jerr);
jpeg_create_decompress(&decoder->mjpeg_cinfo);

diff --git a/src/channel-display-priv.h b/src/channel-display-priv.h
index 5256ad9..92cba50 100644
--- a/src/channel-display-priv.h
+++ b/src/channel-display-priv.h
@@ -41,6 +41,9 @@ struct VideoDecoder {
/* Releases the video decoder's resources */
void (*destroy)(VideoDecoder *decoder);

+ /* Notifies the decoder that the mm-time clock changed. */
+ void (*reschedule)(VideoDecoder *video_decoder);
+
/* Decompresses the specified frame.
*
* @decoder: The video decoder.
@@ -49,7 +52,7 @@ struct VideoDecoder {
* buffer will be invalidated by the next call to
* decode_frame().
*/
- uint8_t* (*decode_frame)(VideoDecoder *decoder, SpiceMsgIn *frame_msg);
+ void (*queue_frame)(VideoDecoder *decoder, SpiceMsgIn *frame_msg, int32_t latency);

/* The format of the encoded video. */
int codec_type;
@@ -98,8 +101,6 @@ struct display_stream {

VideoDecoder *video_decoder;

- GQueue *msgq;
- guint timeout;
SpiceChannel *channel;

/* stats */
@@ -127,6 +128,9 @@ struct display_stream {
};

void stream_get_dimensions(display_stream *st, SpiceMsgIn *frame_msg, int *width, int *height);
+guint32 stream_get_time(display_stream *st);
+void stream_dropped_frame(display_stream *st);
+void stream_display_frame(display_stream *st, SpiceMsgIn *frame_msg, uint8_t* data);
uint32_t spice_msg_in_frame_data(SpiceMsgIn *frame_msg, uint8_t **data);


diff --git a/src/channel-display.c b/src/channel-display.c
index adba315..e591add 100644
--- a/src/channel-display.c
+++ b/src/channel-display.c
@@ -106,11 +106,9 @@ static void channel_set_handlers(SpiceChannelClass *klass);
static void clear_surfaces(SpiceChannel *channel, gboolean keep_primary);
static void clear_streams(SpiceChannel *channel);
static display_surface *find_surface(SpiceDisplayChannelPrivate *c, guint32 surface_id);
-static gboolean display_stream_render(display_stream *st);
static void spice_display_channel_reset(SpiceChannel *channel, gboolean migrating);
static void spice_display_channel_reset_capabilities(SpiceChannel *channel);
static void destroy_canvas(display_surface *surface);
-static void _msg_in_unref_func(gpointer data, gpointer user_data);
static void display_session_mm_time_reset_cb(SpiceSession *session, gpointer data);
static SpiceGlScanout* spice_gl_scanout_copy(const SpiceGlScanout *scanout);

@@ -1090,7 +1088,6 @@ static void display_handle_stream_create(SpiceChannel *channel, SpiceMsgIn *in)
spice_msg_in_ref(in);
st->clip = &op->clip;
st->surface = find_surface(c, op->surface_id);
- st->msgq = g_queue_new();
st->channel = channel;
st->drops_seqs_stats_arr = g_array_new(FALSE, FALSE, sizeof(drops_sequence_stats));

@@ -1109,45 +1106,6 @@ static void display_handle_stream_create(SpiceChannel *channel, SpiceMsgIn *in)
}
}

-/* coroutine or main context */
-static gboolean display_stream_schedule(display_stream *st)
-{
- SpiceSession *session = spice_channel_get_session(st->channel);
- guint32 time, d;
- SpiceStreamDataHeader *op;
- SpiceMsgIn *in;
-
- SPICE_DEBUG("%s", __FUNCTION__);
- if (st->timeout || !session)
- return TRUE;
-
- time = spice_session_get_mm_time(session);
- in = g_queue_peek_head(st->msgq);
-
- if (in == NULL) {
- return TRUE;
- }
-
- op = spice_msg_in_parsed(in);
- if (time < op->multi_media_time) {
- d = op->multi_media_time - time;
- SPICE_DEBUG("scheduling next stream render in %u ms", d);
- st->timeout = g_timeout_add(d, (GSourceFunc)display_stream_render, st);
- return TRUE;
- } else {
- SPICE_DEBUG("%s: rendering too late by %u ms (ts: %u, mmtime: %u), dropping ",
- __FUNCTION__, time - op->multi_media_time,
- op->multi_media_time, time);
- in = g_queue_pop_head(st->msgq);
- spice_msg_in_unref(in);
- st->num_drops_on_playback++;
- if (g_queue_get_length(st->msgq) == 0)
- return TRUE;
- }
-
- return FALSE;
-}
-
static SpiceRect *stream_get_dest(display_stream *st, SpiceMsgIn *frame_msg)
{
if (frame_msg == NULL ||
@@ -1210,66 +1168,54 @@ void stream_get_dimensions(display_stream *st, SpiceMsgIn *frame_msg, int *width
}
}

-/* main context */
-static gboolean display_stream_render(display_stream *st)
+G_GNUC_INTERNAL
+guint32 stream_get_time(display_stream *st)
{
- SpiceMsgIn *in;
+ SpiceSession *session = spice_channel_get_session(st->channel);
+ return session ? spice_session_get_mm_time(session) : 0;
+}

- st->timeout = 0;
- do {
- in = g_queue_pop_head(st->msgq);
+/* coroutine or main context */
+G_GNUC_INTERNAL
+void stream_dropped_frame(display_stream *st)
+{
+ st->num_drops_on_playback++;
+}

- g_return_val_if_fail(in != NULL, FALSE);
+/* main context */
+G_GNUC_INTERNAL
+void stream_display_frame(display_stream *st, SpiceMsgIn *frame_msg,
+ uint8_t* data)
+{
+ int width, height;
+ SpiceRect *dest;
+ int stride;

- uint8_t *out_frame = NULL;
- if (st->video_decoder) {
- out_frame = st->video_decoder->decode_frame(st->video_decoder, in);
- }
- if (out_frame) {
- int width;
- int height;
- SpiceRect *dest;
- uint8_t *data;
- int stride;
-
- stream_get_dimensions(st, in, &width, &height);
- dest = stream_get_dest(st, in);
-
- data = out_frame;
- stride = width * sizeof(uint32_t);
- if (!(stream_get_flags(st) & SPICE_STREAM_FLAGS_TOP_DOWN)) {
- data += stride * (height - 1);
- stride = -stride;
- }
+ stream_get_dimensions(st, frame_msg, &width, &height);
+ dest = stream_get_dest(st, frame_msg);

- st->surface->canvas->ops->put_image(
- st->surface->canvas,
+ stride = width * sizeof(uint32_t);
+ if (!(stream_get_flags(st) & SPICE_STREAM_FLAGS_TOP_DOWN)) {
+ data += stride * (height - 1);
+ stride = -stride;
+ }
+
+ st->surface->canvas->ops->put_image(
+ st->surface->canvas,
#ifdef G_OS_WIN32
- SPICE_DISPLAY_CHANNEL(st->channel)->priv->dc,
+ SPICE_DISPLAY_CHANNEL(st->channel)->priv->dc,
#endif
- dest, data,
- width, height, stride,
- st->have_region ? &st->region : NULL);
-
- if (st->surface->primary)
- g_signal_emit(st->channel, signals[SPICE_DISPLAY_INVALIDATE], 0,
- dest->left, dest->top,
- dest->right - dest->left,
- dest->bottom - dest->top);
- }
+ dest, data,
+ width, height, stride,
+ st->have_region ? &st->region : NULL);

- spice_msg_in_unref(in);
-
- in = g_queue_peek_head(st->msgq);
- if (in == NULL)
- break;
-
- if (display_stream_schedule(st))
- return FALSE;
- } while (1);
-
- return FALSE;
+ if (st->surface->primary)
+ g_signal_emit(st->channel, signals[SPICE_DISPLAY_INVALIDATE], 0,
+ dest->left, dest->top,
+ dest->right - dest->left,
+ dest->bottom - dest->top);
}
+
/* after a sequence of 3 drops, push a report to the server, even
* if the report window is bigger */
#define STREAM_REPORT_DROP_SEQ_LEN_LIMIT 3
@@ -1330,17 +1276,6 @@ static void display_update_stream_report(SpiceDisplayChannel *channel, uint32_t
}
}

-static void display_stream_reset_rendering_timer(display_stream *st)
-{
- SPICE_DEBUG("%s", __FUNCTION__);
- if (st->timeout != 0) {
- g_source_remove(st->timeout);
- st->timeout = 0;
- }
- while (!display_stream_schedule(st)) {
- }
-}
-
/*
* Migration can occur between 2 spice-servers with different mm-times.
* Then, the following cases can happen after migration completes:
@@ -1370,8 +1305,9 @@ static void display_stream_reset_rendering_timer(display_stream *st)
* case 2 is less likely, since at takes at least 20 frames till the dst-server re-identifies
* the video stream and starts sending stream data
*
- * display_session_mm_time_reset_cb handles case 1.a, and
- * display_stream_test_frames_mm_time_reset handles case 2.b
+ * display_session_mm_time_reset_cb handles case 1.a by notifying the
+ * video decoders through their reschedule() method, and case 2.b is handled
+ * directly by the video decoders in their queue_frame() method
*/

/* main context */
@@ -1391,36 +1327,7 @@ static void display_session_mm_time_reset_cb(SpiceSession *session, gpointer dat
}
SPICE_DEBUG("%s: stream-id %d", __FUNCTION__, i);
st = c->streams[i];
- display_stream_reset_rendering_timer(st);
- }
-}
-
-/* coroutine context */
-static void display_stream_test_frames_mm_time_reset(display_stream *st,
- SpiceMsgIn *new_frame_msg,
- guint32 mm_time)
-{
- SpiceStreamDataHeader *tail_op, *new_op;
- SpiceMsgIn *tail_msg;
-
- SPICE_DEBUG("%s", __FUNCTION__);
- g_return_if_fail(new_frame_msg != NULL);
- tail_msg = g_queue_peek_tail(st->msgq);
- if (!tail_msg) {
- return;
- }
- tail_op = spice_msg_in_parsed(tail_msg);
- new_op = spice_msg_in_parsed(new_frame_msg);
-
- if (new_op->multi_media_time < tail_op->multi_media_time) {
- SPICE_DEBUG("new-frame-time < tail-frame-time (%u < %u):"
- " reseting stream, id %d",
- new_op->multi_media_time,
- tail_op->multi_media_time,
- new_op->id);
- g_queue_foreach(st->msgq, _msg_in_unref_func, NULL);
- g_queue_clear(st->msgq);
- display_stream_reset_rendering_timer(st);
+ st->video_decoder->reschedule(st->video_decoder);
}
}

@@ -1440,7 +1347,7 @@ static void display_handle_stream_data(SpiceChannel *channel, SpiceMsgIn *in)
g_return_if_fail(c->nstreams > op->id);

st = c->streams[op->id];
- mmtime = spice_session_get_mm_time(spice_channel_get_session(channel));
+ mmtime = stream_get_time(st);

if (spice_msg_in_type(in) == SPICE_MSG_DISPLAY_STREAM_DATA_SIZED) {
CHANNEL_DEBUG(channel, "stream %d contains sized data", op->id);
@@ -1470,11 +1377,6 @@ static void display_handle_stream_data(SpiceChannel *channel, SpiceMsgIn *in)
st->playback_sync_drops_seq_len++;
} else {
CHANNEL_DEBUG(channel, "video latency: %d", latency);
- spice_msg_in_ref(in);
- display_stream_test_frames_mm_time_reset(st, in, mmtime);
- g_queue_push_tail(st->msgq, in);
- while (!display_stream_schedule(st)) {
- }
if (st->cur_drops_seq_stats.len) {
st->cur_drops_seq_stats.duration = op->multi_media_time -
st->cur_drops_seq_stats.start_mm_time;
@@ -1484,6 +1386,12 @@ static void display_handle_stream_data(SpiceChannel *channel, SpiceMsgIn *in)
}
st->playback_sync_drops_seq_len = 0;
}
+
+ /* Let the video decoder queue the frames so it can optimize their
+ * decoding and best decide if/when to drop them when they are late,
+ * taking into account the impact on later frames.
+ */
+ st->video_decoder->queue_frame(st->video_decoder, in, latency);
if (c->enable_adaptive_streaming) {
display_update_stream_report(SPICE_DISPLAY_CHANNEL(channel), op->id,
op->multi_media_time, latency);
@@ -1516,11 +1424,6 @@ static void display_handle_stream_clip(SpiceChannel *channel, SpiceMsgIn *in)
display_update_stream_region(st);
}

-static void _msg_in_unref_func(gpointer data, gpointer user_data)
-{
- spice_msg_in_unref(data);
-}
-
static void destroy_stream(SpiceChannel *channel, int id)
{
SpiceDisplayChannelPrivate *c = SPICE_DISPLAY_CHANNEL(channel)->priv;
@@ -1574,10 +1477,6 @@ static void destroy_stream(SpiceChannel *channel, int id)
spice_msg_in_unref(st->msg_clip);
spice_msg_in_unref(st->msg_create);

- g_queue_foreach(st->msgq, _msg_in_unref_func, NULL);
- g_queue_free(st->msgq);
- if (st->timeout != 0)
- g_source_remove(st->timeout);
g_free(st);
c->streams[id] = NULL;
}
--
2.8.0.rc3
Francois Gouget
2016-04-05 15:17:02 UTC
Permalink
Signed-off-by: Francois Gouget <***@codeweavers.com>
---
configure.ac | 26 ++-
src/Makefile.am | 8 +
src/channel-display-gst.c | 437 +++++++++++++++++++++++++++++++++++++++++++++
src/channel-display-priv.h | 6 +
src/channel-display.c | 10 ++
5 files changed, 486 insertions(+), 1 deletion(-)
create mode 100644 src/channel-display-gst.c

diff --git a/configure.ac b/configure.ac
index 4227fd8..4609382 100644
--- a/configure.ac
+++ b/configure.ac
@@ -257,6 +257,29 @@ AS_IF([test "x$enable_pulse$have_gstaudio" = "xnono"],
[SPICE_WARNING([No PulseAudio or GStreamer 1.0 audio decoder, audio will not be streamed])
])

+AC_ARG_ENABLE([gstvideo],
+ AS_HELP_STRING([--enable-gstvideo=@<:@auto/yes/no@:>@],
+ [Enable GStreamer video support @<:@default=auto@:>@]),
+ [],
+ [enable_gstvideo="auto"])
+AS_IF([test "x$enable_gstvideo" != "xno"],
+ [SPICE_CHECK_GSTREAMER(GSTVIDEO, 1.0,
+ [gstreamer-1.0 gstreamer-base-1.0 gstreamer-app-1.0 gstreamer-video-1.0],
+ [missing_gstreamer_elements=""
+ SPICE_CHECK_GSTREAMER_ELEMENTS($GST_INSPECT_1_0, [gst-plugins-base 1.0], [appsrc videoconvert appsink])
+ SPICE_CHECK_GSTREAMER_ELEMENTS($GST_INSPECT_1_0, [gst-plugins-good 1.0], [jpegdec vp8dec])
+ SPICE_CHECK_GSTREAMER_ELEMENTS($GST_INSPECT_1_0, [gst-plugins-bad 1.0], [h264parse])
+ SPICE_CHECK_GSTREAMER_ELEMENTS($GST_INSPECT_1_0, [gstreamer-libav 1.0], [avdec_h264])
+ AS_IF([test x"$missing_gstreamer_elements" = "xyes"],
+ SPICE_WARNING([The GStreamer video decoder can be built but may not work.]))
+ ],
+ [AS_IF([test "x$enable_gstvideo" = "xyes"],
+ AC_MSG_ERROR([GStreamer 1.0 video requested but not found]))
+ ])
+ ], [have_gstvideo="no"]
+)
+AM_CONDITIONAL([HAVE_GSTVIDEO], [test "x$have_gstvideo" = "xyes"])
+
AC_CHECK_LIB(jpeg, jpeg_destroy_decompress,
AC_MSG_CHECKING([for jpeglib.h])
AC_TRY_CPP(
@@ -543,7 +566,7 @@ SPICE_CFLAGS="$SPICE_CFLAGS $WARN_CFLAGS"

AC_SUBST(SPICE_CFLAGS)

-SPICE_GLIB_CFLAGS="$PIXMAN_CFLAGS $PULSE_CFLAGS $GSTAUDIO_CFLAGS $GLIB2_CFLAGS $GIO_CFLAGS $GOBJECT2_CFLAGS $SSL_CFLAGS $SASL_CFLAGS"
+SPICE_GLIB_CFLAGS="$PIXMAN_CFLAGS $PULSE_CFLAGS $GSTAUDIO_CFLAGS $GSTVIDEO_CFLAGS $GLIB2_CFLAGS $GIO_CFLAGS $GOBJECT2_CFLAGS $SSL_CFLAGS $SASL_CFLAGS"
SPICE_GTK_CFLAGS="$SPICE_GLIB_CFLAGS $GTK_CFLAGS "

AC_SUBST(SPICE_GLIB_CFLAGS)
@@ -587,6 +610,7 @@ AC_MSG_NOTICE([
Coroutine: ${with_coroutine}
PulseAudio: ${enable_pulse}
GStreamer Audio: ${have_gstaudio}
+ GStreamer Video: ${have_gstvideo}
SASL support: ${have_sasl}
Smartcard support: ${have_smartcard}
USB redirection support: ${have_usbredir} ${with_usbredir_hotplug}
diff --git a/src/Makefile.am b/src/Makefile.am
index 0ef3bea..317e993 100644
--- a/src/Makefile.am
+++ b/src/Makefile.am
@@ -94,6 +94,7 @@ SPICE_COMMON_CPPFLAGS = \
$(SSL_CFLAGS) \
$(SASL_CFLAGS) \
$(GSTAUDIO_CFLAGS) \
+ $(GSTVIDEO_CFLAGS) \
$(SMARTCARD_CFLAGS) \
$(USBREDIR_CFLAGS) \
$(GUDEV_CFLAGS) \
@@ -197,6 +198,7 @@ libspice_client_glib_2_0_la_LIBADD = \
$(SSL_LIBS) \
$(PULSE_LIBS) \
$(GSTAUDIO_LIBS) \
+ $(GSTVIDEO_LIBS) \
$(SASL_LIBS) \
$(SMARTCARD_LIBS) \
$(USBREDIR_LIBS) \
@@ -328,6 +330,12 @@ libspice_client_glib_2_0_la_SOURCES += \
$(NULL)
endif

+if HAVE_GSTVIDEO
+libspice_client_glib_2_0_la_SOURCES += \
+ channel-display-gst.c \
+ $(NULL)
+endif
+
if WITH_PHODAV
libspice_client_glib_2_0_la_SOURCES += \
giopipe.c \
diff --git a/src/channel-display-gst.c b/src/channel-display-gst.c
new file mode 100644
index 0000000..95841bd
--- /dev/null
+++ b/src/channel-display-gst.c
@@ -0,0 +1,437 @@
+/* -*- Mode: C; c-basic-offset: 4; indent-tabs-mode: nil -*- */
+/*
+ Copyright (C) 2015-2016 CodeWeavers, Inc
+
+ This library is free software; you can redistribute it and/or
+ modify it under the terms of the GNU Lesser General Public
+ License as published by the Free Software Foundation; either
+ version 2.1 of the License, or (at your option) any later version.
+
+ This library is distributed in the hope that it will be useful,
+ but WITHOUT ANY WARRANTY; without even the implied warranty of
+ MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
+ Lesser General Public License for more details.
+
+ You should have received a copy of the GNU Lesser General Public
+ License along with this library; if not, see <http://www.gnu.org/licenses/>.
+*/
+#include "config.h"
+
+#include "spice-client.h"
+#include "spice-common.h"
+#include "spice-channel-priv.h"
+
+#include "channel-display-priv.h"
+
+#include <gst/gst.h>
+#include <gst/app/gstappsrc.h>
+#include <gst/app/gstappsink.h>
+
+
+/* GStreamer decoder implementation */
+
+typedef struct SpiceGstDecoder {
+ VideoDecoder base;
+
+ /* ---------- Video characteristics ---------- */
+
+ int width;
+ int height;
+ uint32_t last_mm_time;
+
+ /* ---------- GStreamer pipeline ---------- */
+
+ GstAppSrc *appsrc;
+ GstAppSink *appsink;
+ GstElement *pipeline;
+ GstClock *clock;
+
+ /* ---------- Frame and display queues ---------- */
+
+ GMutex display_mutex;
+ GQueue *display_queue;
+ GQueue *frame_meta_queue;
+ guint timer_id;
+} SpiceGstDecoder;
+
+
+/* ---------- SpiceFrameMeta ---------- */
+
+typedef struct _SpiceFrameMeta {
+ GstClockTime timestamp;
+ SpiceMsgIn *msg;
+ GstSample *sample;
+} SpiceFrameMeta;
+
+static SpiceFrameMeta *create_frame_meta(GstBuffer *buffer, SpiceMsgIn *msg)
+{
+ SpiceFrameMeta *frame_meta = spice_new(SpiceFrameMeta, 1);
+ frame_meta->timestamp = GST_BUFFER_PTS(buffer);
+ frame_meta->msg = msg;
+ spice_msg_in_ref(msg);
+ frame_meta->sample = NULL;
+ return frame_meta;
+}
+
+static void free_frame_meta(SpiceFrameMeta *frame_meta)
+{
+ spice_msg_in_unref(frame_meta->msg);
+ if (frame_meta->sample) {
+ gst_sample_unref(frame_meta->sample);
+ }
+ free(frame_meta);
+}
+
+static SpiceFrameMeta *pop_buffer_frame_meta(SpiceGstDecoder *decoder, GstBuffer *buffer)
+{
+ SpiceFrameMeta *frame_meta;
+ while ((frame_meta = g_queue_pop_head(decoder->frame_meta_queue))) {
+ if (frame_meta->timestamp == GST_BUFFER_PTS(buffer)) {
+ return frame_meta;
+ }
+ /* The corresponding frame was dropped by the GStreamer pipeline
+ * or the pipeline was reset while it was processing a frame.
+ */
+ SPICE_DEBUG("the GStreamer pipeline dropped a frame");
+ spice_msg_in_unref(frame_meta->msg);
+ free(frame_meta);
+ }
+ return NULL;
+}
+
+
+/* ---------- GStreamer pipeline ---------- */
+
+static void schedule_frame(SpiceGstDecoder *decoder);
+
+/* main context */
+static gboolean display_frame(gpointer video_decoder)
+{
+ SpiceGstDecoder *decoder = (SpiceGstDecoder*)video_decoder;
+
+ decoder->timer_id = 0;
+
+ g_mutex_lock(&decoder->display_mutex);
+ SpiceFrameMeta *frame_meta = g_queue_pop_head(decoder->display_queue);
+ g_mutex_unlock(&decoder->display_mutex);
+ g_return_val_if_fail(frame_meta, G_SOURCE_REMOVE);
+
+ GstBuffer *buffer = frame_meta->sample ? gst_sample_get_buffer(frame_meta->sample) : NULL;
+ GstMapInfo mapinfo;
+ if (!frame_meta->sample) {
+ spice_warning("error: got a frame without a sample!");
+ } else if (gst_buffer_map(buffer, &mapinfo, GST_MAP_READ)) {
+ stream_display_frame(decoder->base.stream, frame_meta->msg, mapinfo.data);
+ gst_buffer_unmap(buffer, &mapinfo);
+ } else {
+ spice_warning("GStreamer error: could not map the buffer");
+ }
+ free_frame_meta(frame_meta);
+
+ schedule_frame(decoder);
+ return G_SOURCE_REMOVE;
+}
+
+/* main loop or GStreamer streaming thread */
+static void schedule_frame(SpiceGstDecoder *decoder)
+{
+ guint32 time = stream_get_time(decoder->base.stream);
+ while (!decoder->timer_id) {
+ g_mutex_lock(&decoder->display_mutex);
+
+ SpiceFrameMeta *frame_meta = g_queue_peek_head(decoder->display_queue);
+ if (!frame_meta) {
+ g_mutex_unlock(&decoder->display_mutex);
+ break;
+ }
+
+ SpiceStreamDataHeader *op = spice_msg_in_parsed(frame_meta->msg);
+ if (time < op->multi_media_time) {
+ decoder->timer_id = g_timeout_add(op->multi_media_time - time,
+ display_frame, decoder);
+ } else {
+ SPICE_DEBUG("%s: rendering too late by %u ms (ts: %u, mmtime: %u), dropping ",
+ __FUNCTION__, time - op->multi_media_time,
+ op->multi_media_time, time);
+ g_queue_pop_head(decoder->display_queue);
+ free_frame_meta(frame_meta);
+ }
+
+ g_mutex_unlock(&decoder->display_mutex);
+ }
+}
+
+/* GStreamer thread
+ *
+ * We cannot use GStreamer's signals because they are not always run in
+ * the main context. So use a callback (lower overhead) and have it pull
+ * the sample to avoid a race with free_pipeline(). This means queuing the
+ * decoded frames outside GStreamer. So while we're at it, also schedule
+ * the frame display ourselves in schedule_frame().
+ */
+GstFlowReturn new_sample(GstAppSink *gstappsink, gpointer video_decoder)
+{
+ SpiceGstDecoder *decoder = (SpiceGstDecoder*)video_decoder;
+
+ GstSample *sample = gst_app_sink_pull_sample(decoder->appsink);
+ GstBuffer *buffer = sample ? gst_sample_get_buffer(sample) : NULL;
+ if (sample) {
+ /* Ensure the video decoder object will still be there when
+ * schedule_frame() or display_frame() runs.
+ */
+ g_mutex_lock(&decoder->display_mutex);
+
+ SpiceFrameMeta *frame_meta = pop_buffer_frame_meta(decoder, buffer);
+ if (frame_meta) {
+ frame_meta->sample = sample;
+ g_queue_push_tail(decoder->display_queue, frame_meta);
+ } else {
+ spice_warning("error: lost a buffer meta data!");
+ gst_sample_unref(sample);
+ }
+
+ g_mutex_unlock(&decoder->display_mutex);
+ schedule_frame(decoder);
+ } else {
+ spice_warning("GStreamer error: could not pull sample");
+ }
+ return GST_FLOW_OK;
+}
+
+static void free_pipeline(SpiceGstDecoder *decoder)
+{
+ if (!decoder->pipeline) {
+ return;
+ }
+
+ gst_element_set_state(decoder->pipeline, GST_STATE_NULL);
+ gst_object_unref(decoder->appsrc);
+ gst_object_unref(decoder->appsink);
+ gst_object_unref(decoder->pipeline);
+ gst_object_unref(decoder->clock);
+ decoder->pipeline = NULL;
+}
+
+static gboolean create_pipeline(SpiceGstDecoder *decoder)
+{
+ const gchar *src_caps, *gstdec_name;
+ switch (decoder->base.codec_type) {
+ case SPICE_VIDEO_CODEC_TYPE_MJPEG:
+ src_caps = "caps=image/jpeg";
+ gstdec_name = "jpegdec";
+ break;
+ case SPICE_VIDEO_CODEC_TYPE_VP8:
+ /* typefind is unable to identify VP8 streams by design.
+ * See: https://bugzilla.gnome.org/show_bug.cgi?id=756457
+ */
+ src_caps = "caps=video/x-vp8";
+ gstdec_name = "vp8dec";
+ break;
+ case SPICE_VIDEO_CODEC_TYPE_H264:
+ /* h264 stream detection works fine and setting incomplete caps
+ * causes errors. So let typefind do all the work.
+ */
+ src_caps = "";
+ gstdec_name = "h264parse ! avdec_h264";
+ break;
+ default:
+ spice_warning("Unknown codec type %d", decoder->base.codec_type);
+ return FALSE;
+ }
+
+ /* - We schedule the frame display ourselves, so set sync=false on appsink
+ * so that the pipeline decodes frames as fast as possible. This also
+ * minimizes the risk of frames getting lost when we rebuild the pipeline.
+ * - Set qos=true on appsink so that the elements in the GStreamer pipeline
+ * try to keep up with realtime.
+ * - Set drop=false on appsink and block=true on appsrc so that when
+ * the pipeline really cannot keep up, delays bubble up the pipeline
+ * all the way to queue_frame() where they may result in potentially
+ * helpful bandwidth adjustments on the Spice server.
+ * - Set max-bytes=0 on appsrc so appsrc does not drop frames that may
+ * be needed by those that follow.
+ */
+ gchar *desc = g_strdup_printf("appsrc name=src is-live=true format=time max-bytes=0 block=true %s ! %s ! videoconvert ! appsink name=sink caps=video/x-raw,format=BGRx sync=false qos=true drop=false", src_caps, gstdec_name);
+ SPICE_DEBUG("GStreamer pipeline: %s", desc);
+
+ GError *err = NULL;
+ decoder->pipeline = gst_parse_launch_full(desc, NULL, GST_PARSE_FLAG_FATAL_ERRORS, &err);
+ g_free(desc);
+ if (!decoder->pipeline) {
+ spice_warning("GStreamer error: %s", err->message);
+ g_clear_error(&err);
+ return FALSE;
+ }
+
+ decoder->appsrc = GST_APP_SRC(gst_bin_get_by_name(GST_BIN(decoder->pipeline), "src"));
+ decoder->appsink = GST_APP_SINK(gst_bin_get_by_name(GST_BIN(decoder->pipeline), "sink"));
+ GstAppSinkCallbacks appsink_cbs = {NULL, NULL, &new_sample, {NULL}};
+ gst_app_sink_set_callbacks(decoder->appsink, &appsink_cbs, decoder, NULL);
+
+ decoder->clock = gst_pipeline_get_clock(GST_PIPELINE(decoder->pipeline));
+
+ if (gst_element_set_state(decoder->pipeline, GST_STATE_PLAYING) == GST_STATE_CHANGE_FAILURE) {
+ SPICE_DEBUG("GStreamer error: Unable to set the pipeline to the playing state.");
+ free_pipeline(decoder);
+ return FALSE;
+ }
+
+ return TRUE;
+}
+
+
+/* ---------- VideoDecoder's public API ---------- */
+
+static void spice_gst_decoder_reschedule(VideoDecoder *video_decoder)
+{
+ SpiceGstDecoder *decoder = (SpiceGstDecoder*)video_decoder;
+ if (decoder->timer_id != 0) {
+ g_source_remove(decoder->timer_id);
+ decoder->timer_id = 0;
+ }
+ schedule_frame(decoder);
+}
+
+/* main context */
+static void spice_gst_decoder_destroy(VideoDecoder *video_decoder)
+{
+ SpiceGstDecoder *decoder = (SpiceGstDecoder*)video_decoder;
+
+ /* Stop and free the pipeline to ensure there will not be any further
+ * new_sample() call (clearing thread-safety concerns).
+ */
+ free_pipeline(decoder);
+ g_mutex_clear(&decoder->display_mutex);
+
+ /* Even if we kept the decoder around, once we return the stream will be
+ * destroyed making it impossible to display frames. So cancel any
+ * scheduled display_frame() call and drop the queued frames.
+ */
+ if (decoder->timer_id) {
+ g_source_remove(decoder->timer_id);
+ }
+ SpiceFrameMeta *frame_meta;
+ while ((frame_meta = g_queue_pop_head(decoder->display_queue))) {
+ free_frame_meta(frame_meta);
+ }
+ g_queue_free(decoder->display_queue);
+ while ((frame_meta = g_queue_pop_head(decoder->frame_meta_queue))) {
+ free_frame_meta(frame_meta);
+ }
+ g_queue_free(decoder->frame_meta_queue);
+
+ free(decoder);
+
+ /* Don't call gst_deinit() as other parts of the client
+ * may still be using GStreamer.
+ */
+}
+
+static void release_buffer_data(gpointer data)
+{
+ SpiceMsgIn* frame_msg = (SpiceMsgIn*)data;
+ spice_msg_in_unref(frame_msg);
+}
+
+static void spice_gst_decoder_queue_frame(VideoDecoder *video_decoder,
+ SpiceMsgIn *frame_msg,
+ int32_t latency)
+{
+ SpiceGstDecoder *decoder = (SpiceGstDecoder*)video_decoder;
+
+ uint8_t *data;
+ uint32_t size = spice_msg_in_frame_data(frame_msg, &data);
+ if (size == 0) {
+ SPICE_DEBUG("got an empty frame buffer!");
+ return;
+ }
+
+ SpiceStreamDataHeader *frame_op = spice_msg_in_parsed(frame_msg);
+ if (frame_op->multi_media_time < decoder->last_mm_time) {
+ SPICE_DEBUG("new-frame-time < last-frame-time (%u < %u):"
+ " resetting stream, id %d",
+ frame_op->multi_media_time,
+ decoder->last_mm_time, frame_op->id);
+ /* Let GStreamer deal with the frame anyway */
+ }
+ decoder->last_mm_time = frame_op->multi_media_time;
+
+ if (latency < 0 &&
+ decoder->base.codec_type == SPICE_VIDEO_CODEC_TYPE_MJPEG) {
+ /* Dropping MJPEG frames has no impact on those that follow and
+ * saves CPU so do it.
+ */
+ SPICE_DEBUG("dropping a late MJPEG frame");
+ return;
+ }
+
+ int width, height;
+ stream_get_dimensions(decoder->base.stream, frame_msg, &width, &height);
+ if (width != decoder->width || height != decoder->height) {
+ SPICE_DEBUG("video format change: width %d -> %d, height %d -> %d", decoder->width, width, decoder->height, height);
+ decoder->width = width;
+ decoder->height = height;
+ /* TODO It would be better to flush the pipeline here */
+ free_pipeline(decoder);
+ }
+ if (!decoder->pipeline && !create_pipeline(decoder)) {
+ stream_dropped_frame(decoder->base.stream);
+ return;
+ }
+
+ /* ref() the frame_msg for the buffer */
+ spice_msg_in_ref(frame_msg);
+ GstBuffer *buffer = gst_buffer_new_wrapped_full(GST_MEMORY_FLAG_PHYSICALLY_CONTIGUOUS,
+ data, size, 0, size,
+ frame_msg, &release_buffer_data);
+
+ GST_BUFFER_DURATION(buffer) = GST_CLOCK_TIME_NONE;
+ GST_BUFFER_DTS(buffer) = GST_CLOCK_TIME_NONE;
+ GST_BUFFER_PTS(buffer) = gst_clock_get_time(decoder->clock) - gst_element_get_base_time(decoder->pipeline) + ((uint64_t)latency) * 1000 * 1000;
+
+ g_mutex_lock(&decoder->display_mutex);
+ g_queue_push_tail(decoder->frame_meta_queue, create_frame_meta(buffer, frame_msg));
+ g_mutex_unlock(&decoder->display_mutex);
+
+ if (gst_app_src_push_buffer(decoder->appsrc, buffer) != GST_FLOW_OK) {
+ SPICE_DEBUG("GStreamer error: unable to push frame of size %d", size);
+ stream_dropped_frame(decoder->base.stream);
+ }
+}
+
+G_GNUC_INTERNAL
+gboolean gstvideo_init(void)
+{
+ static int success = 0;
+ if (!success) {
+ GError *err = NULL;
+ if (gst_init_check(NULL, NULL, &err)) {
+ success = 1;
+ } else {
+ spice_warning("Disabling GStreamer video support: %s", err->message);
+ g_clear_error(&err);
+ success = -1;
+ }
+ }
+ return success > 0;
+}
+
+G_GNUC_INTERNAL
+VideoDecoder* create_gstreamer_decoder(int codec_type, display_stream *stream)
+{
+ SpiceGstDecoder *decoder = NULL;
+
+ if (gstvideo_init()) {
+ decoder = spice_new0(SpiceGstDecoder, 1);
+ decoder->base.destroy = spice_gst_decoder_destroy;
+ decoder->base.reschedule = spice_gst_decoder_reschedule;
+ decoder->base.queue_frame = spice_gst_decoder_queue_frame;
+ decoder->base.codec_type = codec_type;
+ decoder->base.stream = stream;
+ g_mutex_init(&decoder->display_mutex);
+ decoder->display_queue = g_queue_new();
+ decoder->frame_meta_queue = g_queue_new();
+ }
+
+ return (VideoDecoder*)decoder;
+}
diff --git a/src/channel-display-priv.h b/src/channel-display-priv.h
index 92cba50..b6ace81 100644
--- a/src/channel-display-priv.h
+++ b/src/channel-display-priv.h
@@ -69,6 +69,12 @@ struct VideoDecoder {
* @return: A pointer to a structure implementing the VideoDecoder methods.
*/
VideoDecoder* create_mjpeg_decoder(int codec_type, display_stream *stream);
+#ifdef HAVE_GSTVIDEO
+VideoDecoder* create_gstreamer_decoder(int codec_type, display_stream *stream);
+gboolean gstvideo_init(void);
+#else
+# define gstvideo_init() FALSE
+#endif


typedef struct display_surface {
diff --git a/src/channel-display.c b/src/channel-display.c
index e591add..3d31596 100644
--- a/src/channel-display.c
+++ b/src/channel-display.c
@@ -716,6 +716,12 @@ static void spice_display_channel_reset_capabilities(SpiceChannel *channel)
#ifdef G_OS_UNIX
spice_channel_set_capability(SPICE_CHANNEL(channel), SPICE_DISPLAY_CAP_GL_SCANOUT);
#endif
+ spice_channel_set_capability(SPICE_CHANNEL(channel), SPICE_DISPLAY_CAP_MULTI_CODEC);
+ spice_channel_set_capability(SPICE_CHANNEL(channel), SPICE_DISPLAY_CAP_CODEC_MJPEG);
+ if (gstvideo_init()) {
+ spice_channel_set_capability(SPICE_CHANNEL(channel), SPICE_DISPLAY_CAP_CODEC_VP8);
+ spice_channel_set_capability(SPICE_CHANNEL(channel), SPICE_DISPLAY_CAP_CODEC_H264);
+ }
}

static void destroy_surface(gpointer data)
@@ -1099,7 +1105,11 @@ static void display_handle_stream_create(SpiceChannel *channel, SpiceMsgIn *in)
st->video_decoder = create_mjpeg_decoder(op->codec_type, st);
break;
default:
+#ifdef HAVE_GSTVIDEO
+ st->video_decoder = create_gstreamer_decoder(op->codec_type, st);
+#else
st->video_decoder = NULL;
+#endif
}
if (st->video_decoder == NULL) {
spice_printerr("could not create a video decoder for codec %d", op->codec_type);
--
2.8.0.rc3
Francois Gouget
2016-04-05 15:17:06 UTC
Permalink
This makes it possible to test the GStreamer video decoder with MJPEG
streams: configure spice-gtk with --disable-builtin-mjpeg and MJPEG
streams will then be handled by the GStreamer decoder instead of the
builtin one.

Signed-off-by: Francois Gouget <***@codeweavers.com>
---
configure.ac | 11 +++++++++++
src/Makefile.am | 7 ++++++-
src/channel-display-priv.h | 2 ++
src/channel-display.c | 5 +++++
4 files changed, 24 insertions(+), 1 deletion(-)

diff --git a/configure.ac b/configure.ac
index 4609382..3118659 100644
--- a/configure.ac
+++ b/configure.ac
@@ -280,6 +280,17 @@ AS_IF([test "x$enable_gstvideo" != "xno"],
)
AM_CONDITIONAL([HAVE_GSTVIDEO], [test "x$have_gstvideo" = "xyes"])

+AC_ARG_ENABLE([builtin-mjpeg],
+ AS_HELP_STRING([--enable-builtin-mjpeg], [Enable the builtin mjpeg video decoder @<:@default=yes@:>@]),
+ [],
+ enable_builtin_mjpeg="yes")
+AS_IF([test "x$enable_builtin_mjpeg" = "xyes"],
+ [AC_DEFINE([HAVE_BUILTIN_MJPEG], 1, [Use the builtin mjpeg decoder?])])
+AM_CONDITIONAL(HAVE_BUILTIN_MJPEG, [test "x$enable_builtin_mjpeg" != "xno"])
+
+AS_IF([test "x$enable_builtin_mjpeg$enable_gstvideo" = "xnono"],
+ [SPICE_WARNING([No builtin MJPEG or GStreamer decoder, video will not be streamed])])
+
AC_CHECK_LIB(jpeg, jpeg_destroy_decompress,
AC_MSG_CHECKING([for jpeglib.h])
AC_TRY_CPP(
diff --git a/src/Makefile.am b/src/Makefile.am
index 317e993..73bb39c 100644
--- a/src/Makefile.am
+++ b/src/Makefile.am
@@ -242,7 +242,6 @@ libspice_client_glib_2_0_la_SOURCES = \
channel-cursor.c \
channel-display.c \
channel-display-priv.h \
- channel-display-mjpeg.c \
channel-inputs.c \
channel-main.c \
channel-playback.c \
@@ -330,6 +329,12 @@ libspice_client_glib_2_0_la_SOURCES += \
$(NULL)
endif

+if HAVE_BUILTIN_MJPEG
+libspice_client_glib_2_0_la_SOURCES += \
+ channel-display-mjpeg.c \
+ $(NULL)
+endif
+
if HAVE_GSTVIDEO
libspice_client_glib_2_0_la_SOURCES += \
channel-display-gst.c \
diff --git a/src/channel-display-priv.h b/src/channel-display-priv.h
index b6ace81..3989899 100644
--- a/src/channel-display-priv.h
+++ b/src/channel-display-priv.h
@@ -68,7 +68,9 @@ struct VideoDecoder {
* @stream: The associated video stream.
* @return: A pointer to a structure implementing the VideoDecoder methods.
*/
+#ifdef HAVE_BUILTIN_MJPEG
VideoDecoder* create_mjpeg_decoder(int codec_type, display_stream *stream);
+#endif
#ifdef HAVE_GSTVIDEO
VideoDecoder* create_gstreamer_decoder(int codec_type, display_stream *stream);
gboolean gstvideo_init(void);
diff --git a/src/channel-display.c b/src/channel-display.c
index 3d31596..4ea1ba7 100644
--- a/src/channel-display.c
+++ b/src/channel-display.c
@@ -717,8 +717,11 @@ static void spice_display_channel_reset_capabilities(SpiceChannel *channel)
spice_channel_set_capability(SPICE_CHANNEL(channel), SPICE_DISPLAY_CAP_GL_SCANOUT);
#endif
spice_channel_set_capability(SPICE_CHANNEL(channel), SPICE_DISPLAY_CAP_MULTI_CODEC);
+#ifdef HAVE_BUILTIN_MJPEG
spice_channel_set_capability(SPICE_CHANNEL(channel), SPICE_DISPLAY_CAP_CODEC_MJPEG);
+#endif
if (gstvideo_init()) {
+ spice_channel_set_capability(SPICE_CHANNEL(channel), SPICE_DISPLAY_CAP_CODEC_MJPEG);
spice_channel_set_capability(SPICE_CHANNEL(channel), SPICE_DISPLAY_CAP_CODEC_VP8);
spice_channel_set_capability(SPICE_CHANNEL(channel), SPICE_DISPLAY_CAP_CODEC_H264);
}
@@ -1101,9 +1104,11 @@ static void display_handle_stream_create(SpiceChannel *channel, SpiceMsgIn *in)
display_update_stream_region(st);

switch (op->codec_type) {
+#ifdef HAVE_BUILTIN_MJPEG
case SPICE_VIDEO_CODEC_TYPE_MJPEG:
st->video_decoder = create_mjpeg_decoder(op->codec_type, st);
break;
+#endif
default:
#ifdef HAVE_GSTVIDEO
st->video_decoder = create_gstreamer_decoder(op->codec_type, st);
--
2.8.0.rc3
Francois Gouget
2016-04-05 15:17:14 UTC
Permalink
configure will use GStreamer 1.0 if present and fall back to
GStreamer 0.10 otherwise.
ffenc_mjpeg takes its bitrate as a long, so set_gstenc_bitrate() is
extended to also handle long and unsigned long bitrate properties.
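
A minimal sketch of the property-introspection approach (placeholder
names, not the patch's exact code; it assumes the encoder element
'gstenc' exposes a "bitrate" property and that <glib-object.h> is
included): look up the property's type and clamp the desired value to
its declared range before setting it.

    void set_bitrate_sketch(GObject *gstenc, uint64_t bit_rate)
    {
        GParamSpec *param =
            g_object_class_find_property(G_OBJECT_GET_CLASS(gstenc), "bitrate");
        if (param == NULL) {
            spice_warning("the encoder has no bitrate property");
            return;
        }
        switch (G_PARAM_SPEC_VALUE_TYPE(param)) {
        case G_TYPE_LONG: {
            /* Clamp to the property's advertised range, then set it */
            GParamSpecLong *range = G_PARAM_SPEC_LONG(param);
            uint64_t rate = MAX(range->minimum, MIN(range->maximum, bit_rate));
            g_object_set(gstenc, "bitrate", (glong)rate, NULL);
            break;
        }
        /* ... the same pattern applies to G_TYPE_INT, G_TYPE_UINT,
         * G_TYPE_ULONG, G_TYPE_INT64 and G_TYPE_UINT64 ...
         */
        default:
            spice_warning("unsupported bitrate property type");
        }
    }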

Signed-off-by: Francois Gouget <***@codeweavers.com>
---
configure.ac | 36 ++++++++++---
server/Makefile.am | 8 +++
server/gstreamer-encoder.c | 131 +++++++++++++++++++++++++++++++++++----------
server/reds.c | 2 +-
server/video-encoder.h | 2 +-
5 files changed, 142 insertions(+), 37 deletions(-)

diff --git a/configure.ac b/configure.ac
index 1e98523..66db560 100644
--- a/configure.ac
+++ b/configure.ac
@@ -70,28 +70,48 @@ dnl Check optional features
SPICE_CHECK_SMARTCARD

AC_ARG_ENABLE(gstreamer,
- AS_HELP_STRING([--enable-gstreamer=@<:@auto/yes/no@:>@],
- [Enable GStreamer 1.0 support]),,
+ AS_HELP_STRING([--enable-gstreamer=@<:@auto/0.10/1.0/yes/no@:>@],
+ [Enable GStreamer support]),,
[enable_gstreamer="auto"])

-if test "x$enable_gstreamer" != "xno"; then
+if test "x$enable_gstreamer" != "xno" && test "x$enable_gstreamer" != "x0.10"; then
SPICE_CHECK_GSTREAMER(GSTREAMER_1_0, 1.0, [gstreamer-1.0 gstreamer-base-1.0 gstreamer-app-1.0 gstreamer-video-1.0],
- [enable_gstreamer="yes"
+ [enable_gstreamer="1.0"
SPICE_CHECK_GSTREAMER_ELEMENTS($GST_INSPECT_1_0, [gst-plugins-base 1.0], [appsrc videoconvert appsink])
SPICE_CHECK_GSTREAMER_ELEMENTS($GST_INSPECT_1_0, [gstreamer-libav 1.0], [avenc_mjpeg])
SPICE_CHECK_GSTREAMER_ELEMENTS($GST_INSPECT_1_0, [gst-plugins-good 1.0], [vp8enc])
SPICE_CHECK_GSTREAMER_ELEMENTS($GST_INSPECT_1_0, [gst-plugins-ugly 1.0], [x264enc])
],
- [if test "x$enable_gstreamer" = "xyes"; then
+ [if test "x$enable_gstreamer" = "x1.0"; then
AC_MSG_ERROR([GStreamer 1.0 support requested but not found. You may set GSTREAMER_1_0_CFLAGS and GSTREAMER_1_0_LIBS to avoid the need to call pkg-config.])
fi
])
fi
AM_CONDITIONAL(HAVE_GSTREAMER_1_0, test "x$have_gstreamer_1_0" = "xyes")

-if test x"$gstreamer_missing" != x; then
- SPICE_WARNING([The following GStreamer $enable_gstreamer tools/elements are missing:$gstreamer_missing. The GStreamer video encoder can be built but may not work.])
+if test "x$enable_gstreamer" != "xno" && test "x$enable_gstreamer" != "x1.0"; then
+ SPICE_CHECK_GSTREAMER(GSTREAMER_0_10, 0.10, [gstreamer-0.10 gstreamer-base-0.10 gstreamer-app-0.10 gstreamer-video-0.10],
+ [enable_gstreamer="0.10"
+ SPICE_CHECK_GSTREAMER_ELEMENTS($GST_INSPECT_0_10, [gst-plugins-base 0.10], [appsrc appsink])
+ SPICE_CHECK_GSTREAMER_ELEMENTS($GST_INSPECT_0_10, [gstreamer-ffmpeg 0.10], [ffmpegcolorspace ffenc_mjpeg])
+ SPICE_CHECK_GSTREAMER_ELEMENTS($GST_INSPECT_0_10, [gst-plugins-bad 0.10], [vp8enc])
+ SPICE_CHECK_GSTREAMER_ELEMENTS($GST_INSPECT_0_10, [gst-plugins-ugly 0.10], [x264enc])
+ ],
+ [if test "x$enable_gstreamer" = "x0.10"; then
+ AC_MSG_ERROR([GStreamer 0.10 support requested but not found. You may set GSTREAMER_0_10_CFLAGS and GSTREAMER_0_10_LIBS to avoid the need to call pkg-config.])
+ fi
+ ])
fi
+AM_CONDITIONAL(HAVE_GSTREAMER_0_10, test "x$have_gstreamer_0_10" = "xyes")
+
+AS_IF([test "x$enable_gstreamer" = "xyes"],
+ [AC_MSG_ERROR("GStreamer support requested but not found")],
+ [test "x$enable_gstreamer" = "xauto"],
+ [enable_gstreamer="no"
+])
+AS_IF([test x"$missing_gstreamer_elements" = xyes],
+ [SPICE_WARNING([The GStreamer video encoder can be built but may not work.])
+])

AC_ARG_ENABLE([automated_tests],
AS_HELP_STRING([--enable-automated-tests], [Enable automated tests using spicy-screenshot (part of spice--gtk)]),,
@@ -271,7 +291,7 @@ AC_MSG_NOTICE([

LZ4 support: ${enable_lz4}
Smartcard: ${have_smartcard}
- GStreamer 1.0: ${have_gstreamer_1_0}
+ GStreamer: ${enable_gstreamer}
SASL support: ${have_sasl}
Automated tests: ${enable_automated_tests}
Manual: ${have_asciidoc}
diff --git a/server/Makefile.am b/server/Makefile.am
index 6149b7b..14dbc81 100644
--- a/server/Makefile.am
+++ b/server/Makefile.am
@@ -12,6 +12,7 @@ AM_CPPFLAGS = \
$(SASL_CFLAGS) \
$(SLIRP_CFLAGS) \
$(SMARTCARD_CFLAGS) \
+ $(GSTREAMER_0_10_CFLAGS) \
$(GSTREAMER_1_0_CFLAGS) \
$(SPICE_PROTOCOL_CFLAGS) \
$(SSL_CFLAGS) \
@@ -46,6 +47,7 @@ libserver_la_LIBADD = \
$(PIXMAN_LIBS) \
$(SASL_LIBS) \
$(SLIRP_LIBS) \
+ $(GSTREAMER_0_10_LIBS) \
$(GSTREAMER_1_0_LIBS) \
$(SSL_LIBS) \
$(Z_LIBS) \
@@ -154,6 +156,12 @@ libserver_la_SOURCES += \
$(NULL)
endif

+if HAVE_GSTREAMER_0_10
+libserver_la_SOURCES += \
+ gstreamer-encoder.c \
+ $(NULL)
+endif
+
if HAVE_GSTREAMER_1_0
libserver_la_SOURCES += \
gstreamer-encoder.c \
diff --git a/server/gstreamer-encoder.c b/server/gstreamer-encoder.c
index 3ff6343..7f07060 100644
--- a/server/gstreamer-encoder.c
+++ b/server/gstreamer-encoder.c
@@ -33,19 +33,28 @@

#define SPICE_GST_DEFAULT_FPS 30

-#define DO_ZERO_COPY
+#ifndef HAVE_GSTREAMER_0_10
+# define DO_ZERO_COPY
+#endif


typedef struct {
SpiceBitmapFmt spice_format;
const char *format;
uint32_t bpp;
+ uint32_t depth;
+ uint32_t endianness;
+ uint32_t blue_mask;
+ uint32_t green_mask;
+ uint32_t red_mask;
} SpiceFormatForGStreamer;

typedef struct SpiceGstVideoBuffer {
VideoBuffer base;
GstBuffer *gst_buffer;
+#ifndef HAVE_GSTREAMER_0_10
GstMapInfo map;
+#endif
} SpiceGstVideoBuffer;

typedef struct {
@@ -272,6 +281,9 @@ static void spice_gst_video_buffer_free(VideoBuffer *video_buffer)
{
SpiceGstVideoBuffer *buffer = (SpiceGstVideoBuffer*)video_buffer;
if (buffer->gst_buffer) {
+#ifndef HAVE_GSTREAMER_0_10
+ gst_buffer_unmap(buffer->gst_buffer, &buffer->map);
+#endif
gst_buffer_unref(buffer->gst_buffer);
}
free(buffer);
@@ -726,11 +738,11 @@ static const SpiceFormatForGStreamer *map_format(SpiceBitmapFmt format)
* section-types-definitions.html documents.
*/
static const SpiceFormatForGStreamer format_map[] = {
- {SPICE_BITMAP_FMT_RGBA, "BGRA", 32},
- {SPICE_BITMAP_FMT_16BIT, "RGB15", 16},
+ {SPICE_BITMAP_FMT_RGBA, "BGRA", 32, 24, 4321, 0xff000000, 0xff0000, 0xff00},
+ {SPICE_BITMAP_FMT_16BIT, "RGB15", 16, 15, 4321, 0x001f, 0x03E0, 0x7C00},
/* TODO: Test the other formats */
- {SPICE_BITMAP_FMT_32BIT, "BGRx", 32},
- {SPICE_BITMAP_FMT_24BIT, "BGR", 24},
+ {SPICE_BITMAP_FMT_32BIT, "BGRx", 32, 24, 4321, 0xff000000, 0xff0000, 0xff00},
+ {SPICE_BITMAP_FMT_24BIT, "BGR", 24, 24, 4321, 0xff0000, 0xff00, 0xff},
};

int i;
@@ -752,8 +764,18 @@ static void set_appsrc_caps(SpiceGstEncoder *encoder)
gst_caps_unref(encoder->src_caps);
}
encoder->src_caps = gst_caps_new_simple(
+#ifdef HAVE_GSTREAMER_0_10
+ "video/x-raw-rgb",
+ "bpp", G_TYPE_INT, encoder->format->bpp,
+ "depth", G_TYPE_INT, encoder->format->depth,
+ "endianness", G_TYPE_INT, encoder->format->endianness,
+ "red_mask", G_TYPE_INT, encoder->format->red_mask,
+ "green_mask", G_TYPE_INT, encoder->format->green_mask,
+ "blue_mask", G_TYPE_INT, encoder->format->blue_mask,
+#else
"video/x-raw",
"format", G_TYPE_STRING, encoder->format->format,
+#endif
"width", G_TYPE_INT, encoder->width,
"height", G_TYPE_INT, encoder->height,
"framerate", GST_TYPE_FRACTION, get_source_fps(encoder), 1,
@@ -785,7 +807,11 @@ static const gchar* get_gst_codec_name(SpiceGstEncoder *encoder)
switch (encoder->base.codec_type)
{
case SPICE_VIDEO_CODEC_TYPE_MJPEG:
+#ifdef HAVE_GSTREAMER_0_10
+ return "ffenc_mjpeg";
+#else
return "avenc_mjpeg";
+#endif
case SPICE_VIDEO_CODEC_TYPE_VP8:
return "vp8enc";
case SPICE_VIDEO_CODEC_TYPE_H264:
@@ -799,6 +825,11 @@ static const gchar* get_gst_codec_name(SpiceGstEncoder *encoder)

static gboolean create_pipeline(SpiceGstEncoder *encoder)
{
+#ifdef HAVE_GSTREAMER_0_10
+ const gchar *converter = "ffmpegcolorspace";
+#else
+ const gchar *converter = "videoconvert";
+#endif
const gchar* gstenc_name = get_gst_codec_name(encoder);
if (!gstenc_name) {
return FALSE;
@@ -807,30 +838,39 @@ static gboolean create_pipeline(SpiceGstEncoder *encoder)
switch (encoder->base.codec_type)
{
case SPICE_VIDEO_CODEC_TYPE_MJPEG:
+#ifdef HAVE_GSTREAMER_0_10
+ gstenc_opts = g_strdup("");
+#else
/* Set max-threads to ensure zero-frame latency */
gstenc_opts = g_strdup("max-threads=1");
+#endif
break;
case SPICE_VIDEO_CODEC_TYPE_VP8: {
/* See http://www.webmproject.org/docs/encoder-parameters/
- * - Set end-usage to get a constant bitrate to help with streaming.
+ * - Set mode/end-usage to get a constant bitrate to help with
+ * streaming.
* - resize-allowed allows trading resolution for low bitrates while
* min-quantizer ensures the bitrate does not get needlessly high.
* - error-resilient minimises artifacts in case the client drops a
* frame.
* - Set lag-in-frames, deadline and cpu-used to match
- * "Profile Realtime". lag-in-frames ensures zero-frame latency,
- * deadline turns on realtime behavior, and cpu-used targets a 75%
- * CPU usage.
+ * "Profile Realtime". max-latency/lag-in-frames ensures zero-frame
+ * latency, deadline turns on realtime behavior, cpu-used targets a
+ * 75% CPU usage while speed simply prioritizes encoding speed.
* - deadline is supposed to be set in microseconds but in practice
* it behaves like a boolean.
* - At least up to GStreamer 1.6.2, vp8enc cannot be trusted to pick
* the optimal number of threads. Also exceeding the number of
* physical core really degrades image quality.
- * - token-partitions parallelizes more operations.
+ * - token-parts/token-partitions parallelizes more operations.
*/
int threads = get_physical_core_count();
int parts = threads < 2 ? 0 : threads < 4 ? 1 : threads < 8 ? 2 : 3;
+#ifdef HAVE_GSTREAMER_0_10
+ gstenc_opts = g_strdup_printf("mode=cbr min-quantizer=10 resize-allowed=true error-resilient=true max-latency=0 speed=7 threads=%d token-parts=%d", threads, parts);
+#else
gstenc_opts = g_strdup_printf("end-usage=cbr min-quantizer=10 resize-allowed=true error-resilient=true lag-in-frames=0 deadline=1 cpu-used=4 threads=%d token-partitions=%d", threads, parts);
+#endif
break;
}
case SPICE_VIDEO_CODEC_TYPE_H264:
@@ -849,7 +889,7 @@ static gboolean create_pipeline(SpiceGstEncoder *encoder)
}

GError *err = NULL;
- gchar *desc = g_strdup_printf("appsrc is-live=true format=time do-timestamp=true name=src ! videoconvert ! %s %s name=encoder ! appsink name=sink", gstenc_name, gstenc_opts);
+ gchar *desc = g_strdup_printf("appsrc is-live=true format=time do-timestamp=true name=src ! %s ! %s %s name=encoder ! appsink name=sink", converter, gstenc_name, gstenc_opts);
spice_debug("GStreamer pipeline: %s", desc);
encoder->pipeline = gst_parse_launch_full(desc, NULL, GST_PARSE_FLAG_FATAL_ERRORS, &err);
g_free(gstenc_opts);
@@ -898,6 +938,18 @@ static void set_gstenc_bitrate(SpiceGstEncoder *encoder)
g_object_set(gobject, prop, (guint)gst_bit_rate, NULL);
break;
}
+ case G_TYPE_LONG: {
+ GParamSpecLong *range = G_PARAM_SPEC_LONG(param);
+ gst_bit_rate = MAX(range->minimum, MIN(range->maximum, gst_bit_rate));
+ g_object_set(gobject, prop, (glong)gst_bit_rate, NULL);
+ break;
+ }
+ case G_TYPE_ULONG: {
+ GParamSpecULong *range = G_PARAM_SPEC_ULONG(param);
+ gst_bit_rate = MAX(range->minimum, MIN(range->maximum, gst_bit_rate));
+ g_object_set(gobject, prop, (gulong)gst_bit_rate, NULL);
+ break;
+ }
case G_TYPE_INT64: {
GParamSpecInt64 *range = G_PARAM_SPEC_INT64(param);
gst_bit_rate = MAX(range->minimum, MIN(range->maximum, gst_bit_rate));
@@ -1148,21 +1200,44 @@ static inline int chunk_copy(SpiceGstEncoder *encoder, const SpiceBitmap *bitmap
return TRUE;
}

+#ifdef HAVE_GSTREAMER_0_10
+/* Dummy structure to avoid too many #ifdefs in the main code paths */
+typedef struct {
+ gpointer memory;
+} GstMapInfo;
+#endif
+
/* A helper for push_raw_frame() */
static uint8_t *allocate_and_map_memory(gsize size, GstMapInfo *map, GstBuffer *buffer)
{
+#ifdef HAVE_GSTREAMER_0_10
+ buffer->malloc_data = g_malloc(size);
+ GST_BUFFER_DATA(buffer) = buffer->malloc_data;
+ GST_BUFFER_SIZE(buffer) = size;
+
+ return GST_BUFFER_DATA(buffer);
+#else
GstMemory *mem = gst_allocator_alloc(NULL, size, NULL);
if (!mem) {
gst_buffer_unref(buffer);
return NULL;
}
- if (!gst_memory_map(mem, map, GST_MAP_WRITE))
- {
+ if (!gst_memory_map(mem, map, GST_MAP_WRITE)) {
gst_memory_unref(mem);
gst_buffer_unref(buffer);
return NULL;
}
return map->data;
+#endif
+}
+
+static void unmap_and_release_memory(GstMapInfo *map, GstBuffer *buffer)
+{
+#ifndef HAVE_GSTREAMER_0_10
+ gst_memory_unmap(map->memory, map);
+ gst_memory_unref(map->memory);
+#endif
+ gst_buffer_unref(buffer);
}

/* A helper for spice_gst_encoder_encode_frame() */
@@ -1195,9 +1270,7 @@ static int push_raw_frame(SpiceGstEncoder *encoder,

chunk_offset += src->left * encoder->format->bpp / 8;
if (!line_copy(encoder, bitmap, chunk_offset, stream_stride, height, dst)) {
- gst_memory_unmap(map.memory, &map);
- gst_memory_unref(map.memory);
- gst_buffer_unref(buffer);
+ unmap_and_release_memory(&map, buffer);
return VIDEO_ENCODER_FRAME_UNSUPPORTED;
}
} else {
@@ -1215,23 +1288,27 @@ static int push_raw_frame(SpiceGstEncoder *encoder,
*/
#endif

- if (len) {
- uint8_t *dst = allocate_and_map_memory(len, &map, buffer);
- if (!dst) {
- return VIDEO_ENCODER_FRAME_UNSUPPORTED;
- }
- if (!chunk_copy(encoder, bitmap, chunk_index, chunk_offset, len, dst)) {
- gst_memory_unmap(map.memory, &map);
- gst_memory_unref(map.memory);
- gst_buffer_unref(buffer);
- return VIDEO_ENCODER_FRAME_UNSUPPORTED;
- }
+ uint8_t *dst = allocate_and_map_memory(len, &map, buffer);
+
+ if (len && !dst) {
+ return VIDEO_ENCODER_FRAME_UNSUPPORTED;
+ }
+
+ if (!chunk_copy(encoder, bitmap, chunk_index, chunk_offset, len, dst)) {
+ unmap_and_release_memory(&map, buffer);
+ return VIDEO_ENCODER_FRAME_UNSUPPORTED;
}
}
+#ifdef HAVE_GSTREAMER_0_10
+ gst_buffer_set_caps(buffer, encoder->src_caps);
+#endif
+
+#ifndef HAVE_GSTREAMER_0_10
if (map.memory) {
gst_memory_unmap(map.memory, &map);
gst_buffer_append_memory(buffer, map.memory);
}
+#endif
GST_BUFFER_OFFSET(buffer) = encoder->frame++;

GstFlowReturn ret = gst_app_src_push_buffer(encoder->appsrc, buffer);
diff --git a/server/reds.c b/server/reds.c
index 03eb678..410ba09 100644
--- a/server/reds.c
+++ b/server/reds.c
@@ -3558,7 +3558,7 @@ static const EnumNames video_encoder_names[] = {

static new_video_encoder_t video_encoder_procs[] = {
&mjpeg_encoder_new,
-#ifdef HAVE_GSTREAMER_1_0
+#if defined(HAVE_GSTREAMER_1_0) || defined(HAVE_GSTREAMER_0_10)
&gstreamer_encoder_new,
#else
NULL,
diff --git a/server/video-encoder.h b/server/video-encoder.h
index 41c7f17..4073315 100644
--- a/server/video-encoder.h
+++ b/server/video-encoder.h
@@ -193,7 +193,7 @@ VideoEncoder* mjpeg_encoder_new(SpiceVideoCodecType codec_type,
VideoEncoderRateControlCbs *cbs,
bitmap_ref_t bitmap_ref,
bitmap_unref_t bitmap_unref);
-#ifdef HAVE_GSTREAMER_1_0
+#if defined(HAVE_GSTREAMER_1_0) || defined(HAVE_GSTREAMER_0_10)
VideoEncoder* gstreamer_encoder_new(SpiceVideoCodecType codec_type,
uint64_t starting_bit_rate,
VideoEncoderRateControlCbs *cbs,
--
2.8.0.rc3
Francois Gouget
2016-04-05 15:17:19 UTC
Permalink
Signed-off-by: Francois Gouget <***@codeweavers.com>
---
src/channel-display-gst.c | 17 +++++++++++++++--
src/channel-display-priv.h | 4 ++--
src/channel-display.c | 21 +++++++++++++++++----
3 files changed, 34 insertions(+), 8 deletions(-)

diff --git a/src/channel-display-gst.c b/src/channel-display-gst.c
index 95841bd..5e6bd57 100644
--- a/src/channel-display-gst.c
+++ b/src/channel-display-gst.c
@@ -399,8 +399,7 @@ static void spice_gst_decoder_queue_frame(VideoDecoder *video_decoder,
}
}

-G_GNUC_INTERNAL
-gboolean gstvideo_init(void)
+static gboolean gstvideo_init(void)
{
static int success = 0;
if (!success) {
@@ -435,3 +434,17 @@ VideoDecoder* create_gstreamer_decoder(int codec_type, display_stream *stream)

return (VideoDecoder*)decoder;
}
+
+G_GNUC_INTERNAL
+gboolean gstvideo_has_codec(int codec_type)
+{
+ gboolean has_codec = FALSE;
+
+ VideoDecoder *decoder = create_gstreamer_decoder(codec_type, NULL);
+ if (decoder) {
+ has_codec = create_pipeline((SpiceGstDecoder*)decoder);
+ decoder->destroy(decoder);
+ }
+
+ return has_codec;
+}
diff --git a/src/channel-display-priv.h b/src/channel-display-priv.h
index 3989899..e27c499 100644
--- a/src/channel-display-priv.h
+++ b/src/channel-display-priv.h
@@ -73,9 +73,9 @@ VideoDecoder* create_mjpeg_decoder(int codec_type, display_stream *stream);
#endif
#ifdef HAVE_GSTVIDEO
VideoDecoder* create_gstreamer_decoder(int codec_type, display_stream *stream);
-gboolean gstvideo_init(void);
+gboolean gstvideo_has_codec(int codec_type);
#else
-# define gstvideo_init() FALSE
+# define gstvideo_has_codec(codec_type) FALSE
#endif


diff --git a/src/channel-display.c b/src/channel-display.c
index 4ea1ba7..e5a298e 100644
--- a/src/channel-display.c
+++ b/src/channel-display.c
@@ -720,10 +720,23 @@ static void spice_display_channel_reset_capabilities(SpiceChannel *channel)
#ifdef HAVE_BUILTIN_MJPEG
spice_channel_set_capability(SPICE_CHANNEL(channel), SPICE_DISPLAY_CAP_CODEC_MJPEG);
#endif
- if (gstvideo_init()) {
- spice_channel_set_capability(SPICE_CHANNEL(channel), SPICE_DISPLAY_CAP_CODEC_MJPEG);
- spice_channel_set_capability(SPICE_CHANNEL(channel), SPICE_DISPLAY_CAP_CODEC_VP8);
- spice_channel_set_capability(SPICE_CHANNEL(channel), SPICE_DISPLAY_CAP_CODEC_H264);
+ if (gstvideo_has_codec(SPICE_VIDEO_CODEC_TYPE_MJPEG)) {
+ spice_channel_set_capability(SPICE_CHANNEL(channel),
+ SPICE_DISPLAY_CAP_CODEC_MJPEG);
+ } else {
+ spice_info("GStreamer does not support the mjpeg codec");
+ }
+ if (gstvideo_has_codec(SPICE_VIDEO_CODEC_TYPE_VP8)) {
+ spice_channel_set_capability(SPICE_CHANNEL(channel),
+ SPICE_DISPLAY_CAP_CODEC_VP8);
+ } else {
+ spice_info("GStreamer does not support the vp8 codec");
+ }
+ if (gstvideo_has_codec(SPICE_VIDEO_CODEC_TYPE_H264)) {
+ spice_channel_set_capability(SPICE_CHANNEL(channel),
+ SPICE_DISPLAY_CAP_CODEC_H264);
+ } else {
+ spice_info("GStreamer does not support the h264 codec");
}
}
--
2.8.0.rc3
Francois Gouget
2016-04-05 15:17:25 UTC
Permalink
This means future video codecs may be supported automatically.
One can also force the use of decodebin by setting $SPICE_GSTVIDEO_AUTO
in the client's environment.
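
As an aside, a hypothetical sanity check (not part of this patch) could
warn when the decodebin element itself is missing, since the fallback
cannot work without it:

    GstElementFactory *factory = gst_element_factory_find("decodebin");
    if (factory == NULL) {
        spice_warning("decodebin is not available, unknown codecs cannot be decoded");
    } else {
        gst_object_unref(factory);
    }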

Signed-off-by: Francois Gouget <***@codeweavers.com>
---
src/channel-display-gst.c | 16 ++++++++++++++--
1 file changed, 14 insertions(+), 2 deletions(-)

diff --git a/src/channel-display-gst.c b/src/channel-display-gst.c
index 5e6bd57..a063da8 100644
--- a/src/channel-display-gst.c
+++ b/src/channel-display-gst.c
@@ -235,8 +235,20 @@ static gboolean create_pipeline(SpiceGstDecoder *decoder)
gstdec_name = "h264parse ! avdec_h264";
break;
default:
- spice_warning("Unknown codec type %d", decoder->base.codec_type);
-        return FALSE;
+ SPICE_DEBUG("Unknown codec type %d. Trying decodebin.",
+ decoder->base.codec_type);
+ src_caps = "";
+ gstdec_name = NULL;
+ break;
+ }
+
+ /* decodebin will use vaapi if installed, which for a time could
+ * intentionally crash the application. So only use decodebin as a
+ * fallback or when SPICE_GSTVIDEO_AUTO is set.
+ * See: https://bugs.freedesktop.org/show_bug.cgi?id=90884
+ */
+ if (!gstdec_name || getenv("SPICE_GSTVIDEO_AUTO")) {
+ gstdec_name = "decodebin";
}

/* - We schedule the frame display ourselves so set sync=false on appsink
--
2.8.0.rc3