The further embedding of immigration checks into UK public sector institutions has made them key sites of bordering. Information systems for datafication, established to enable the reporting and sharing of data between these institutions and the UK Home Office, have become emerging sites of opposition to the UK’s border and immigration regime. In this paper, I will highlight the ways in which ‘everyday borderworkers’ in hospitals and higher education have practiced forms of refusal that have undermined these information systems and made care and education accessible to patients and students. However, such ‘data activism’, and the ‘un/bordering’ it enables, is under threat from the expansion of machine learning into state bordering practices and processes, or what Louise Amoore has called ‘the deep border’. Just as, for some states, almost every mundane space is becoming a potential site of bordering, so too computer science appears to be rendering all spaces as ‘feature spaces’. A feature is a set of attributes associated with an example, and it is generated from the examples the algorithm is exposed to: whether data is withheld or not, the algorithm still generates the feature from the examples that are there. Clustering algorithms, Amoore argues, not only become a way of imagining and grouping people, places and even countries but also of inferring the behaviours and attributes of those groups. I will argue that the expansion of the deep border into the bordering of public sector institutions will render obsolete current forms of data activism that seek to deborder these institutions.
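
To illustrate the mechanism invoked here, the following is a minimal sketch, in Python with scikit-learn, of how a clustering algorithm groups examples in a feature space and then reads group-level attributes back onto every member of a group, including one whose data was withheld. The feature matrix, the imputation step, and the choice of k-means are illustrative assumptions only, not a reconstruction of Amoore’s examples or of any actual bordering system.

```python
# A minimal sketch, assuming scikit-learn and a toy feature matrix.
# The records, features, and choice of k-means are illustrative; this is
# not a description of any Home Office system.
import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.cluster import KMeans

# Toy "feature space": each row is an example (a person), each column an
# attribute. np.nan marks a value that was withheld.
X = np.array([
    [2.0, 0.0],
    [3.0, 1.0],
    [8.0, 4.0],
    [9.0, np.nan],   # this person withheld one attribute
    [2.5, 0.5],
    [7.5, 5.0],
])

# Withholding a value does not stop feature generation: the imputer fills
# the gap from the examples that are there (here, the column mean).
X_filled = SimpleImputer(strategy="mean").fit_transform(X)

# Cluster the examples into groups purely on the basis of their features.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X_filled)

# Group-level inference: attributes averaged over a cluster are attached to
# every member of that cluster, including the person whose data was withheld.
for cluster in np.unique(labels):
    members = X_filled[labels == cluster]
    print(f"cluster {cluster}: inferred attribute profile = {members.mean(axis=0)}")
```

The point of the sketch is that refusing to supply one value does not remove a person from the feature space; the algorithm works with the examples that are there, and the inference made about the group is attributed to them regardless.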