Contextuality from missing and versioned data

Foundations of physics and/or philosophy of physics, and in particular, posts on unresolved or controversial issues

Postby minkwe » Thu Nov 09, 2017 12:05 pm

Jason Morton wrote:
Traditionally, categorical data analysis (e.g. generalized linear models) works with simple, flat datasets akin to a single table in a database, with no notion of missing data or conflicting versions. In contrast, modern data analysis must deal with distributed databases with many partial local tables that need not always agree. The computational agents tabulating these tables are spatially separated, with binding speed-of-light constraints and data arriving too rapidly for these distributed views ever to be fully informed and globally consistent. Contextuality is a mathematical property that describes a kind of inconsistency arising in quantum mechanics (e.g. in Bell's theorem). In this paper we show how contextuality can arise in common data collection scenarios, including missing data and versioning (as in low-latency distributed databases employing snapshot isolation). In the companion paper, we develop statistical models adapted to this regime.

Fascinating counter-example to Bell's theorem from the most unexpected of places.
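To see the kind of inconsistency the abstract is pointing at, here is a minimal sketch (not from the paper; the variable names and tables are invented for illustration). Three "local tables" each record a pair of binary columns, and each table is internally consistent, yet no single joint table restricts to all three views — the classic signature of contextuality:

```python
from itertools import product

# Toy "distributed database": three contexts, each a partial local table
# over a pair of binary columns. Each context reports perfect
# anti-correlation between the two columns it observed together.
contexts = {
    ("A", "B"): {(0, 1), (1, 0)},
    ("B", "C"): {(0, 1), (1, 0)},
    ("A", "C"): {(0, 1), (1, 0)},
}

def has_global_section(contexts):
    """Return True if some single joint assignment to all columns
    restricts to a row of every local table (a 'global section')."""
    cols = sorted({c for pair in contexts for c in pair})
    for values in product([0, 1], repeat=len(cols)):
        joint = dict(zip(cols, values))
        if all((joint[x], joint[y]) in rows
               for (x, y), rows in contexts.items()):
            return True
    return False

# A != B and B != C force A == C, contradicting the (A, C) table,
# so no joint table explains all three pairwise views.
print(has_global_section(contexts))
```

Flipping any one context to perfect correlation (e.g. `{(0, 0), (1, 1)}` for `("A", "C")`) restores a global section, which is why only the collection of views, not any single view, is inconsistent.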
