Consensus on Dynamic Stochastic Block Models: Fast Convergence and Phase Transitions
We introduce two models of consensus following a majority rule on time-evolving stochastic block models (SBM), in which the network evolution is Markovian or non-Markovian. Under the majority rule, in each round every agent simultaneously updates his/her opinion according to the majority opinion among his/her neighbors. Our network has a community structure and evolves randomly over time. In contrast to the classic setting, the dynamics is not purely deterministic: it reflects the structure of the SBM by resampling the connections at each step, so that agents holding the same opinion are more likely to be connected than agents holding different opinions. In the Markovian model, all connections between agents are resampled at each step according to the SBM law, and each agent then updates his/her opinion via the majority rule. We prove a power-of-one type result: any initial bias gives the leading opinion a non-trivial probability of winning in the end, uniformly in the size of the network. In the non-Markovian model, the connection between two agents is resampled according to the SBM law only when one of the two changes opinion, and is otherwise kept unchanged. We study the phase transition between fast convergence to consensus and a halt of the dynamics. Moreover, we establish thresholds on the initial lead that correspond to various convergence speeds.
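To make the Markovian dynamics concrete, here is a minimal simulation sketch in Python/NumPy. It assumes two opinions encoded as ±1, hypothetical same-opinion and different-opinion connection probabilities `p_in > p_out`, and a tie-breaking convention (isolated agents and ties keep their current opinion) chosen for illustration; it depicts the resample-then-majority update loop, not the paper's exact construction or parameters.

```python
import numpy as np

def simulate_markovian_majority(n=200, p_in=0.1, p_out=0.02,
                                initial_plus_fraction=0.55,
                                max_rounds=50, seed=0):
    """Sketch of majority dynamics on a dynamic SBM (Markovian variant).

    Each round, every edge is resampled independently: agents with equal
    opinions are connected with probability p_in, agents with different
    opinions with probability p_out.  All agents then simultaneously adopt
    the majority opinion among their neighbors.  Ties and isolated agents
    keep their current opinion (an illustrative convention, not from the
    paper).
    """
    rng = np.random.default_rng(seed)
    # Opinions are +1 / -1; start with a slight bias toward +1.
    opinions = np.where(rng.random(n) < initial_plus_fraction, 1, -1)

    for t in range(max_rounds):
        same = opinions[:, None] == opinions[None, :]        # same-opinion mask
        probs = np.where(same, p_in, p_out)                  # SBM edge probabilities
        upper = np.triu(rng.random((n, n)) < probs, k=1)     # resample edges, no self-loops
        adjacency = (upper | upper.T).astype(int)            # symmetric adjacency matrix
        neighbor_sum = adjacency @ opinions                  # sum of neighbors' opinions
        new_opinions = np.where(neighbor_sum > 0, 1,
                         np.where(neighbor_sum < 0, -1, opinions))
        if np.all(new_opinions == new_opinions[0]):
            return int(new_opinions[0]), t + 1               # consensus reached
        opinions = new_opinions

    return 0, max_rounds                                     # no consensus within max_rounds

if __name__ == "__main__":
    winner, rounds = simulate_markovian_majority()
    if winner == 0:
        print(f"no consensus within {rounds} rounds")
    else:
        print(f"consensus on opinion {winner:+d} after {rounds} rounds")
```

Under this toy setup, running the script repeatedly with different seeds gives a rough empirical sense of how often the initially leading opinion wins, which is the quantity the power-of-one result concerns.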