Image Credit: PlaySight

PlaySight trained AI on thousands of hours of videos to understand sports


Sports analytics, the use of data and statistics to measure the performance of players and teams and to inform coaching decisions, is an enormous market. Grand View Research projects it will be worth $4.6 billion by 2025, expanding at a compound annual growth rate of 31.2% from 2019.

Perhaps unsurprisingly, startups are pursuing it with gusto, and one of the pack leaders is PlaySight. The Tel Aviv-based company keeps a low profile, but it has raised $26 million in capital since its founding in 2010 (and plans to raise again in the coming months) from SoftBank, Navar Corporation, Verizon Ventures, pro golf legend Greg Norman, and other backers. Moreover, it counts among its customers the NBA's Boston Celtics, Golden State Warriors, and Toronto Raptors, as well as more than 80 NCAA and NJCAA (National Junior College Athletic Association) programs and the United States Tennis Association's National Campus.

At the 2020 OurCrowd Global Investor summit in Jerusalem this week, VentureBeat caught up with founder and CEO Chen Shachar to get the skinny on PlaySight’s technology — specifically its use of AI and machine learning.

Live broadcasting

PlaySight works with customers to build what it calls SmartCourts, connected systems consisting of cameras installed around fields, courts, gyms, and rinks. At a high level, the cameras and the software layered atop them provide automated live streaming, tagging, and predictive analytics, as well as audio annotation and drawing tools for positioning.


The technology has roots in the Israeli military — Shachar and cofounders Evgeni Khazanov and Yoram Bentzur spent years developing weaponry and war simulators. The first SmartCourt was installed in 2014, and the number of installations now stands at “hundreds” of athletic centers, school gymnasiums, clubs, and federations, which use it to set goals and build regimens or choose from drills guided by coaches like Darren Cahill and Paul Annacone.

The company's PlaySight Edge service offers multi-angle video recording and ball- and player-tracking, with a broadcasting interface that lets users zoom in, play footage in slow motion, and rewind for instant replays. Through a mobile app, users can review and share sports video and data, and create highlights from tagged and bookmarked plays uploaded to PlaySight's sports network.

AI and machine learning

PlaySight currently supports over 30 sports, including baseball, volleyball, swimming, lacrosse, gymnastics, dancing, wrestling, and cricket. Analytics isn't available for all of them, but the sports it does cover tap AI models trained on thousands of hours of video recordings.

PlaySight refers to its family of models as SmartTracker, and Shachar says they improve with each game by learning the most efficient ways to track the action. Off-the-shelf cameras supplied by PlaySight capture the footage the models ingest and analyze, but SmartTracker also works with existing cameras (from a single camera up to 10) that meet certain baseline requirements. There's also a portable option, courtesy of a partnership with LiveU, that can stream and record video without the need for cabling.
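
To make the tracking step concrete, here is a minimal sketch of the kind of detect-then-track loop a single camera feed might run. It assumes nothing about PlaySight's proprietary SmartTracker models; OpenCV's stock HOG person detector simply stands in for a trained, sport-specific model, and a real system would additionally link detections across frames into player trajectories.

import cv2

# Minimal sketch of a per-feed detection loop. The HOG person detector is a
# generic stand-in, not PlaySight's model; detections here are kept per frame
# rather than linked into trajectories.
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def track_feed(source):
    cap = cv2.VideoCapture(source)   # camera index or video file path
    per_frame_boxes = []             # raw bounding boxes, one list per frame
    while True:
        ok, frame = cap.read()
        if not ok:
            break                    # end of stream
        boxes, _weights = hog.detectMultiScale(frame, winStride=(8, 8))
        per_frame_boxes.append(boxes)
    cap.release()
    return per_frame_boxes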

Subscribers to PlaySight’s SmartCourt Pro tennis plan see serves and strokes tracked and tagged by type, speed, spin, and more. Additionally, they get an overview of performance with a 3D shot map, which breaks down points with automatically generated analytics, percentages, and performance data.
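
What that might look like under the hood is sketched below: a hypothetical per-shot record whose tags (stroke type, speed, spin, landing position) roll up into the percentages a shot map displays. The field names and summary are assumptions for illustration, not PlaySight's actual schema.

from dataclasses import dataclass
from statistics import mean

# Hypothetical record for one tagged shot; field names are illustrative only.
@dataclass
class Shot:
    stroke: str        # e.g. "serve", "forehand", "backhand"
    speed_mph: float
    spin_rpm: float
    landed_in: bool
    court_x: float     # landing position, meters from the center line
    court_y: float     # landing position, meters from the net

def serve_summary(shots):
    """Roll tagged serves up into the kind of aggregates a shot map shows."""
    serves = [s for s in shots if s.stroke == "serve"]
    if not serves:
        return None
    return {
        "count": len(serves),
        "avg_speed_mph": round(mean(s.speed_mph for s in serves), 1),
        "in_pct": round(100 * sum(s.landed_in for s in serves) / len(serves), 1),
    }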


Adding a new sport can take anywhere from months to roughly a year, depending on the sport's complexity. According to Shachar, basics like player tracking are easy enough; PlaySight's pretrained models generalize well to new playing fields and rules of play. But building something more bespoke, like a soccer model that keeps tabs on goals scored and penalties incurred, requires more development time and effort.
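
As a toy illustration of why the bespoke work is harder, the sketch below layers a made-up goal-scoring rule on top of generic ball-tracking output. The coordinate system and thresholds are invented for illustration, not PlaySight's logic; the point is that generic models yield positions, while event logic such as "that was a goal" has to be built per sport.

# Toy goal-counting rule layered on generic tracking output. Coordinates and
# thresholds are hypothetical.
GOAL_LINE_X = 0.0        # assumed x coordinate of the goal line, in meters
GOAL_HALF_WIDTH = 3.66   # half the width of a regulation goal, in meters

def count_goals(ball_positions):
    """ball_positions: list of (x, y) ball coordinates, one per frame."""
    goals = 0
    prev_x = None
    for x, y in ball_positions:
        crossed_line = prev_x is not None and prev_x > GOAL_LINE_X >= x
        if crossed_line and abs(y) <= GOAL_HALF_WIDTH:
            goals += 1
        prev_x = x
    return goals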

And some tasks — like tracking pucks during a hockey match, when struck pucks can reach upwards of 109.2 miles per hour — are beyond the PlaySight platform’s capabilities. It’s not so much the models that are the limiting factor, but the camera’s (or cameras’) frame rate. “It’s feasible to do it with higher-speed cameras, but if you’re trying to achieve a mass-market solution, it [doesn’t make sense],” said Shachar. “We’re trying to reduce the number of cameras [required] to reduce the cost of the system.”
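
A quick back-of-the-envelope calculation shows why frame rate, rather than the models, is the bottleneck: at 109.2 miles per hour, a puck covers a lot of ice between consecutive frames of a standard camera. The figures below are simple arithmetic, not PlaySight measurements.

# How far a puck traveling at 109.2 mph moves between consecutive frames at
# common capture rates; simple arithmetic, not vendor data.
MPH_TO_MPS = 0.44704
puck_speed_mps = 109.2 * MPH_TO_MPS        # roughly 48.8 meters per second

for fps in (30, 60, 240):
    gap_m = puck_speed_mps / fps           # meters traveled per frame
    print(f"{fps:>3} fps: puck moves {gap_m:.2f} m between frames")

# Output:
#  30 fps: puck moves 1.63 m between frames
#  60 fps: puck moves 0.81 m between frames
# 240 fps: puck moves 0.20 m between frames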

Competition

It's worth noting that PlaySight is far from the only startup vying for a share of the lucrative AI sports analytics segment.

Keemotion — whose customers include Columbia’s volleyball team, professional soccer leagues, NBA teams, and several NCAA Division I athletic programs — captures footage from multiple cameras to autonomously track balls and adjust the center of the viewport during the most exciting moments. Like PlaySight, it annotates recorded footage and streams it to handheld tablets, which coaches and support staff can use to view replays and home in on specific points of the floor.

Minute.ly, a fellow Tel Aviv-based computer vision startup, offers real-time analysis software that automatically divvies up livestreams into attention-grabbing, heart-thumping five-to-seven-second clips. Meanwhile, sports tech company Hudl supports video analysis workflows for youth, high school, club, and professional teams. And startup Stats captures up to 2,700 data points per analyzed game, tapping machine learning algorithms to tally the distance players run, the trajectory and speed of balls, and teammates' touches.


Elsewhere, IBM has for the past several years tapped its Watson AI service to derive highlights from U.S. Open tennis matches, taking into account crowd noise, emotional player reactions, and other factors. A platform developed by Fujitsu uses sensors to track gymnasts' movements and AI to analyze them, measuring skeletal positions, speeds, and angles. On the academic side, researchers at the Queensland University of Technology last year proposed an AI system capable not only of anticipating a tennis opponent's actions but of doing so with "player-level" behavioral patterns.

But in contrast to some of its competitors, Shachar says, PlaySight's focus has been, and remains, keeping product and service pricing low. Indeed, over 20 high schools, colleges, and universities have installed PlaySight's technology across several sports and venues, including ice hockey and soccer.

“We’re [working] hard to develop technology that can democratize sports analytics,” he said. “In the next three years, most sports facilities or the majority of sports facilities are going to be smart — they’re going to be connected. And the AI will just get better and better.”