Big data is coming to the octagon. The Nevada State Athletic Commission today approved a test run for a new UFC-approved product that uses analytics and artificial intelligence to tell the story of a fight.

At UFC 219, select fighters will be equipped with glove sensors that measure everything from punch strength to stress. The data from those sensors will then be interpreted in real time using analytics and artificial intelligence. The UFC’s hope is to give fight fans a better understanding of what’s really happening inside the octagon with a new set of stats that can be shown during broadcasts. But first the promotion needs to make sure the data is accurate and useful.

Today during an NSAC hearing in Las Vegas, the UFC pitched the program as a way to potentially improve fighter safety, including concussion protocols and training methods. The commission was optimistic about its potential, though it expressed concerns about how the data is stored and used.

NSAC Chairman Anthony Marnell likened the program to the use of sabermetrics in baseball but cautioned that the data could sway judges if displayed on their cageside TV monitors during a fight. He said the UFC should work behind the scenes with the commission to determine the best rollout. The commission also wants to sign off before any initial findings are released.

Commissioner Raymond “Skip” Avansino noted that previous attempts to place sensors on fighters resulted in data being distributed without permission.

The collaboration that produced the new product came together in 2016, when Endeavor CEO and UFC co-owner Ari Emanuel brokered a deal between analytics company AGT International and consumer platform company HEED.
A live demonstration took place this past month at the tech conference “AWS re:Invent 2017.” After a mock sparring session between UFC fighters Edson Barboza and Mark Diakiese, HEED co-founder Mati Kochavi said the sensors used to collect data produce 70 new insights about what happens during a fight. The sensors on the gloves alone, he said, produce 12 different “stories.” There are sensors in the octagon canvas to measure movement and range, and even sensors monitoring a fighter’s cornermen and family members.

Kochavi said all of the data is interpreted by a highly complex “AI agent,” which works from a detailed “world graph” of different data points. Those points include the attributes a fighter displays in the cage, such as his style, emotion, and energy, as well as the surrounding environment of fans, referee, and media. The AI agent can beam all that information to fans via smartphone, and fans can tailor what they see based on their preference for particular stats.

“Those insights are covering entire aspects of the fight between Diakiese and Barboza,” Kochavi said. “They cover their passion, the power of the fight, the resiliency, the strategy. All of those things happen in the octagon.

“Shouldn’t we tell the story of sport that way? Shouldn’t sport be told in real time, with real data, with real information, and with real insights, and the real emotions? We are a company which is trying to revolutionize the way we’re going to (broadcast) sports and live events.”

Fans won’t immediately see the changes when they watch UFC 219, which takes place Dec. 30 at T-Mobile Arena in Las Vegas. But every time they see a punch, a new line of data will be collected. What becomes of that data is the next big question.