The level of distribution: TensorFlow's distribution is achieved at the graph level, via subgraph execution; the individual components of a TensorFlow graph (Tensors, Variables, Operations) cannot be distributed on their own. Spark's distribution, by contrast, is achieved at the RDD level, which is the foundation of Spark: that is to say, every RDD operation, and any computational graph built on RDDs, is distributed.

TensorFlow supports asynchronous training: asynchronous training arises naturally from the concurrent execution of replicated subgraphs, and synchronous training is also possible in distributed TensorFlow. Spark, on the other hand, follows the Bulk Synchronous Parallel (BSP) model and therefore only supports synchronous computation, so asynchronous training in Spark MLlib hardly happens.

TensorFlow supports a parameter-server & worker structure: in distributed TensorFlow, the user can assign each device either a ps task or a worker task. I think this feature is...
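The asynchronous-vs-BSP contrast can be made concrete with a toy simulation (plain Python, no TensorFlow or Spark required; the loss f(x) = x², the learning rate, and the worker count are illustrative assumptions, not taken from either framework): under BSP every worker's gradient is computed against the same parameter snapshot and applied once at a barrier, while under asynchronous parameter-server training each worker reads the latest value and applies its update immediately.

```python
LR = 0.1  # learning rate (illustrative)

def grad(x):
    # Gradient of the toy loss f(x) = x**2.
    return 2.0 * x

def bsp_round(param, n_workers):
    # Spark-style BSP: all workers compute gradients against the SAME
    # parameter snapshot; results are aggregated at a barrier and the
    # update is applied once per round.
    grads = [grad(param) for _ in range(n_workers)]
    return param - LR * sum(grads) / len(grads)

def async_round(param, n_workers):
    # TensorFlow-style async: each replicated subgraph reads the CURRENT
    # parameter and applies its update immediately, with no barrier, so
    # later workers see the effect of earlier ones within the same round.
    for _ in range(n_workers):
        param = param - LR * grad(param)
    return param

print(bsp_round(1.0, 3))    # one synchronized step from x = 1.0
print(async_round(1.0, 3))  # three immediate, staggered updates
```

In this toy case the two schedules already diverge after a single round, which is exactly why a BSP engine cannot express asynchronous training without stepping outside its model.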
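The ps/worker split is declared through a cluster specification. A minimal sketch of such a layout, using hypothetical host names, in the plain-dict shape that TF1-style `tf.train.ClusterSpec` accepts (the dict itself needs no TensorFlow import):

```python
# Hypothetical host:port addresses -- placeholders, not real machines.
# One job holds the model parameters ("ps"); the other runs the
# replicated training subgraphs ("worker").
cluster = {
    "ps": ["ps0.example.com:2222"],
    "worker": [
        "worker0.example.com:2222",
        "worker1.example.com:2222",
    ],
}

def task_kind(job, index):
    """Return which role a given (job, index) pair plays in the cluster."""
    assert job in cluster and 0 <= index < len(cluster[job])
    return "parameter server" if job == "ps" else "training worker"
```

Each process in the cluster is then started with its own (job, index) pair, so every device knows whether it serves parameters or computes gradients.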