Following is a quote from AVS, and then my theory...
"Dynamic metadata streams independently along with the data stream from internal apps. Sony then maps this metadata to the display. But, HDMI < 2.1 cannot pass dynamic metadata. Dolby has gotten around that limitation by embedding the dynamic metadata into the video signal when passed via HDMI. In this case, the display must extract the metadata from the video signal before it can process it. Either Sony TVs cannot perform this process or Dolby has introduced a new method for Sony to be the first to use."
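To make the quoted bit concrete: Dolby's actual in-signal tunneling format is proprietary, so the following is purely a conceptual sketch of the general idea — hiding a metadata side channel inside the pixel data itself (here, in the least significant bits) so it can ride over an interface that has no dedicated dynamic-metadata path, and then having the display dig it back out before processing. All function names here are made up for illustration.

```python
# Conceptual sketch only -- NOT Dolby's real format. Illustrates stuffing a
# metadata side-channel into pixel LSBs so it survives a link (like HDMI < 2.1)
# that has no dedicated lane for dynamic HDR metadata.

def embed_metadata(pixels, metadata):
    """Pack metadata bytes, one bit per pixel, into the pixel LSBs."""
    out = list(pixels)
    bits = [(byte >> i) & 1 for byte in metadata for i in range(8)]
    if len(bits) > len(out):
        raise ValueError("not enough pixels to carry the metadata")
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit  # overwrite the least significant bit
    return out

def extract_metadata(pixels, num_bytes):
    """Display side: recover num_bytes of metadata from the pixel LSBs."""
    data = bytearray()
    for b in range(num_bytes):
        byte = 0
        for i in range(8):
            byte |= (pixels[b * 8 + i] & 1) << i
        data.append(byte)
    return bytes(data)

# A display that understands the scheme extracts the metadata before
# tone-mapping; one that doesn't just shows (near-)normal pixels, since
# only the LSBs were disturbed.
frame = [128] * 64                      # fake 8-bit pixel values
carried = embed_metadata(frame, b"L1")  # "L1" stands in for a metadata packet
print(extract_metadata(carried, 2))     # b'L1'
```

The point of the toy example is the last step: the TV has to know the scheme and do extra work to pull the metadata back out, which is exactly the processing step the AVS quote suggests Sony sets either can't do or do differently.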
How does this tinfoil hat theory sound?
I could go with "Sony TVs cannot perform the process currently," because MediaTek just launched their DV Android TV chips, and Sony uses an off-the-shelf solution AFAIK. LG, by contrast, has had a designed-with-Dolby custom SoC for 2016 and 2017.
My guess is that when Dolby and Sony decided at CES 2017 to put DV on the 2017 sets, Dolby, Sony, and MediaTek had to get together and figure it out, which is where the UBP-X700 comes in. Companies being companies, none of them shared this little tidbit with Oppo or the other MediaTek customers until the bomb was dropped; now Oppo has to go figure it out on their own and then get recertified.
Makes me wonder whether the UBP-X700 will end up being the first DV player to work with Sony TVs, and whether that was by design...
Sony, using off-the-shelf parts, won't have the MediaTek Android TV DV chip until this year, which is probably why they punted on this year's models for now. I think this low-latency thing was a stopgap to let their 2016 and 2017 models have it, and that support will come from the UBP-X700 and from the Oppo 203 once it gets up to speed.