Switchboard: A Matchmaking System for Multiplayer Mobile Games


Switchboard: A Matchmaking System for Multiplayer Mobile Games
Justin Manweiler, Sharad Agarwal, Ming Zhang, Romit Roy Choudhury, Paramvir Bahl
ACM MobiSys 2011

Breakthrough of Mobile Gaming
●  iPhone App Store: 350K applications; 47% of time on mobile apps is spent gaming; games are 20% of apps but 80% of downloads
●  Windows Phone 7: the top 10+ apps are games
●  John Carmack (Wolfenstein 3D, Doom, Quake): “multiplayer in some form is where the breakthrough, platform-defining things are going to happen in the mobile space”


Mobile Games: Now and Tomorrow
●  In order of increasing interactivity:
   Single-player (mobile today)
   Multiplayer, turn-based (mobile today)
   Multiplayer, fast-action (mobile soon)


Key Challenge
●  Bandwidth is fine: 250 kbps is enough to host a 16-player Halo 3 game
●  Delay bounds are much tighter:

   Game Type               Latency Threshold
   First-person, Racing    ≈ 100 ms
   Sports, Role-playing    ≈ 500 ms
   Real-time Strategy      ≈ 1000 ms

●  Challenge: find groups of peers that can play well together


The Matchmaking Problem
(Figure: clients, their connection latencies, and the game's end-to-end latency threshold.)
●  Each client contributes connection latency; each game has an end-to-end latency threshold
●  Match clients to satisfy total delay bounds (sketched below)
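The delay-bound check behind this slide can be written down directly. A minimal Python sketch, assuming a hypothetical predicted_rtt_ms(a, b) estimator and reusing the genre thresholds from the Key Challenge table; an illustration, not the paper's algorithm.

from itertools import combinations

# Genre delay bounds from the Key Challenge table earlier in the deck.
LATENCY_THRESHOLD_MS = {
    "first_person": 100, "racing": 100,
    "sports": 500, "role_playing": 500,
    "real_time_strategy": 1000,
}

def viable_group(clients, genre, predicted_rtt_ms):
    """True if every client pair meets the genre's end-to-end delay bound.

    predicted_rtt_ms(a, b) is a hypothetical predictor, e.g. a high
    percentile of the estimated RTT between clients a and b.
    """
    bound = LATENCY_THRESHOLD_MS[genre]
    return all(predicted_rtt_ms(a, b) <= bound
               for a, b in combinations(clients, 2))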


Instability in a Static Environment
(Figure: median latency in ms, roughly 150–310 ms, versus time of day from 9:36 to 12:00 AM.)
●  Due to instability, we must consider the latency distribution, not just the median (a sketch follows)
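A minimal sketch of what "consider the latency distribution" might look like in code: keep a short window of RTT samples per phone and summarize it with an empirical percentile rather than a single median. The 15-minute window mirrors the intervals used later in the deck; the 90th-percentile summary and the LatencyWindow name are illustrative assumptions.

import time
from collections import deque

class LatencyWindow:
    """Sliding window of RTT samples, summarized by an empirical percentile."""

    def __init__(self, window_s=15 * 60):
        self.window_s = window_s
        self.samples = deque()          # (timestamp, rtt_ms) pairs

    def add(self, rtt_ms, now=None):
        now = time.time() if now is None else now
        self.samples.append((now, rtt_ms))
        # Drop samples older than the window.
        while self.samples and now - self.samples[0][0] > self.window_s:
            self.samples.popleft()

    def percentile(self, q=0.90):
        rtts = sorted(rtt for _, rtt in self.samples)
        if not rtts:
            raise ValueError("no samples in the current window")
        return rtts[min(int(q * len(rtts)), len(rtts) - 1)]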


End-to-end Latency over 3G
(Figure: empirical CDF of RTT in ms, 0–600 ms, for AT&T-to-AT&T phone pairs communicating directly and relayed via Bing, Duke, and UW, with the latency thresholds for first-person shooter, racing, sports, and real-time strategy marked.)
●  Peer-to-peer reduces latency and is cost-effective


The Matchmaking Problem
●  Targeting 3G: play anywhere
●  Link performance: latency, not bandwidth (interactivity is key)
●  Measurement / prediction: at game timescales
●  Grouping: P2P, scalability


Requirements for 3G Matchmaking
●  Latency estimation has to be accurate
   Or games will be unplayable / fail
●  Grouping has to be fast
   Or impatient users will give up before a game is initiated
●  Matchmaking has to be scalable
   For game servers
   For the cellular network
   For user mobile devices


State of the Art
●  Latency estimation
   Pyxida, stable network coordinates; Ledlie et al. [NSDI 07]
   Vivaldi, distributed latency estimation; Dabek et al. [SIGCOMM 04]
●  Game matchmaking for wired networks
   Htrae, game matchmaking in wired networks; Agarwal et al. [SIGCOMM 09]
●  General 3G network performance
   3GTest with 30K users; Huang et al. [MobiSys 2010]
   Interactions with applications; Liu et al. [MobiCom 08]
   Empirical 3G performance; Tan et al. [InfoCom 07]
   TCP/IP over 3G; Chan & Ramjee [MobiCom 02]
●  Takeaway: latency estimation and matchmaking are established for wired networks


A “Black Box” for Game Developers
(Figure: two phones connected across the cellular data path (RNC, SGSN, GGSN) and the Internet; the “black box” sits between the game and the IP network.)
●  Crowdsourced link performance measurement (over time) feeds the black box
●  The black box answers end-to-end performance queries for game developers (a sketch follows)
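One plausible way such a black box could compose an end-to-end answer from crowdsourced pieces, sketched under an assumed additive model (not necessarily the paper's exact method): RTTs over consecutive path segments add, so combine each phone's measured 3G last-mile RTT with an estimate of the wired path between their gateways.

def estimate_end_to_end_rtt_ms(last_mile_rtt_a_ms, last_mile_rtt_b_ms,
                               wired_rtt_ms):
    """Assumed additive model: A's 3G last mile + wired core + B's 3G last mile."""
    return last_mile_rtt_a_ms + wired_rtt_ms + last_mile_rtt_b_ms

# e.g. estimate_end_to_end_rtt_ms(160, 180, 20) -> 360 ms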


Crowdsourcing 3G over Time
(Figure: latency similarity by time.)


Crowdsourcing 3G over Space
(Figure: latency similarity by distance.)


Can we crowdsource HSDPA 3G?
●  How does 3G performance vary over time?
   How quickly do old measurements “expire”?
   How many measurements are needed to characterize the latency distribution?
   …
●  How does 3G performance vary over space?
   Signal strength? Mobility speed?
   Phones under the same cell tower?
   Same part of the cellular network?
   …
●  Details of the parameter space are left for the paper (our goal is not to identify the exact causes)


Methodology
●  Platform
   Windows Mobile and Android phones
   HSDPA 3G on AT&T and T-Mobile
●  Carefully deployed phones
   Continuous measurements
   Simultaneous, synchronized traces at multiple sites
●  Several locations
   Princeville, Hawaii
   Redmond and Seattle, Washington
   Durham and Raleigh, North Carolina
   Los Angeles, California


Stability over Time (in a Static Environment)
(Figure: empirical CDFs of RTT in ms, roughly 120–240 ms, measured in Redmond on AT&T over 15-minute intervals; the black line is phone 1, all other lines are phone 2.)
●  Similar latencies under the same tower
●  Performance drifts over longer time periods
●  Live characterization is necessary, and it is feasible


Stability over Space (at the same time)
(Figure: empirical CDFs of the RTT difference at the 90th percentile, 0–200 ms, for Seattle-area locations: S-home, Latona, U Village, Herkimer, Northgate, 1st Ave.)
●  Similarity at the same cell tower
●  Substantial variation and divergence between nearby towers (see the sketch below)
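A minimal sketch of the comparison the figure suggests: two measurement sources (phones, or towers) are treated as interchangeable only if their latency distributions agree at the 90th percentile. The 25 ms tolerance and the function name are illustrative assumptions, not values from the paper.

def can_share_measurements(p90_a_ms, p90_b_ms, tolerance_ms=25):
    """True if the RTT difference at the 90th percentile is small enough.

    The percentile inputs could come from LatencyWindow.percentile() in
    the earlier sketch; the tolerance is an assumed, illustrative value.
    """
    return abs(p90_a_ms - p90_b_ms) <= tolerance_ms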


Switchboard: crowdsourced matchmaking
(Figure: architecture. Games talk to the Switchboard cloud service, hosted on MSFT Azure, which comprises a Grouping Agent, a Measurement Controller, a Latency Estimator, and a store of latency data; each phone runs a network testing service.)
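A minimal sketch of the role the Grouping Agent plays, under assumed interfaces: a latency estimator answers pairwise RTT predictions, and the agent greedily fills groups whose predicted delays stay within the game's bound. This is a simple greedy pass for illustration, not the paper's grouping algorithm.

def greedy_groups(waiting_clients, group_size, bound_ms, estimate_rtt_ms):
    """Greedily form fixed-size groups whose pairwise RTTs meet the bound.

    estimate_rtt_ms(a, b) stands in for the Latency Estimator. Returns
    (groups, leftover), where leftover clients wait for the next round.
    """
    groups, leftover, pool = [], [], list(waiting_clients)
    while pool:
        group = [pool.pop(0)]           # seed a group with the oldest client
        for client in list(pool):
            if len(group) == group_size:
                break
            if all(estimate_rtt_ms(client, member) <= bound_ms
                   for member in group):
                group.append(client)
                pool.remove(client)
        if len(group) == group_size:
            groups.append(group)
        else:
            leftover.extend(group)
    return groups, leftover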


Scalability through Reuse…
●  Across time: a stable distribution over 15-minute time intervals
●  Across space: phones can share probing tasks equitably for each tower (see the sketch below)
●  Across games: a shared cloud service for any interactive game
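A minimal sketch of "share probing tasks equitably for each tower", with hypothetical names: a controller splits each tower's probing budget as evenly as possible across the phones currently camped on that tower, so no single handset carries the whole measurement load.

from collections import defaultdict

def assign_probe_tasks(phones_by_tower, probes_per_tower):
    """Map phone id -> number of probes to send this interval.

    phones_by_tower: dict of tower id -> list of phone ids on that tower.
    Each tower's budget is split as evenly as possible across its phones.
    """
    tasks = defaultdict(int)
    for tower, phones in phones_by_tower.items():
        if not phones:
            continue
        base, extra = divmod(probes_per_tower, len(phones))
        for i, phone in enumerate(phones):
            tasks[phone] += base + (1 if i < extra else 0)
    return dict(tasks)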


Client Matchmaking Delay
(Figure: empirical CDF of the time until a client is placed in a group, 0–700 s, for client arrival rates of 1/sec and 10/sec.)
●  Switchboard clients benefit from deployment at scale


Conclusion
●  Latency: key challenge for fast-action multiplayer
●  3G latency variability makes prediction hard
●  Crowdsourcing enables scalable 3G latency estimation
●  Switchboard: crowdsourced matchmaking for 3G


Thank you!

cs.duke.edu/~jgm
[email protected]