Worksheet: Humans in the Loop (2024)

This blog is part of a Sunday reading task assigned by Dilip Barad. It offers a close analysis of Humans in the Loop across three themes: AI, Bias, and Epistemic Representation; Labour and the Politics of Cinematic Visibility; and Film Form, Structure, and Digital Culture.


🎬 PRE-VIEWING WORKSHEET: CONTEXT & KEY CONCEPTS


🔹 AI Bias & Indigenous Knowledge Systems 

  • AI bias refers to systematic distortions in algorithmic outputs that arise from the data, categories, and assumptions embedded in machine learning systems. Rather than being neutral computational errors, biases often reflect historical inequalities, dominant cultural norms, and selective representation within datasets.

  • Machine learning systems depend on classification. However, classification is never ideologically neutral; it simplifies complex realities into fixed, standardized categories. These categories are shaped by designers, institutions, and economic priorities, thereby embedding cultural assumptions into technical infrastructures.

  • Indigenous ecological knowledge systems operate differently. They emphasize relationality, seasonal rhythms, oral transmission, and context-dependent understanding. Knowledge is experiential and collective rather than abstracted into universal taxonomies.

  • When such situated knowledge encounters rigid algorithmic structures, tension emerges. Indigenous frameworks resist reduction because they are grounded in lived interaction with land, community, and environment.

  • Thus, indigenous epistemologies challenge technological framings by exposing the limits of computational universality. They reveal that intelligence is plural and culturally situated, not singular or purely mathematical.


🔹 Labour & Digital Economies 

  • Invisible labour in digital economies refers to forms of human work that sustain technological systems but remain socially and economically obscured. In AI production, this includes data annotation, content moderation, verification, and correction.

  • Although AI is frequently described as autonomous or self-learning, machine learning models require continuous human intervention. Workers classify images, interpret language, and refine datasets so algorithms can function effectively.

  • This labour is often outsourced, precarious, and geographically marginalized. It operates within global digital capitalism, where value accumulates at the top of technological hierarchies while cognitive effort remains under-recognized.

  • Highlighting invisible labour is politically significant because it disrupts the myth of automation. It reveals that so-called artificial intelligence is dependent on human judgment.

  • Economically, invisible labour transforms human cognition into scalable data capital. Culturally, its invisibility reinforces assumptions that innovation is detached from embodied work. Bringing such labour into narrative focus exposes structural inequalities within contemporary digital economies.


🔹 Politics of Representation 

  • Representation in cinema is not mere depiction but the construction of meaning through framing, selection, and narrative emphasis. Media shapes how audiences understand both technology and marginalized identities.

  • Public discourse often portrays AI as progressive, objective, and future-oriented. Conversely, Adivasi communities are frequently framed within developmental narratives as traditional or outside modernity. This contrast reinforces hierarchical binaries between technological modernity and indigenous life.

  • By centering an Adivasi woman within an AI context, the film’s publicity and reviews suggest a disruption of this binary. Technology and indigenous identity are placed in dialogue rather than opposition.

  • Representation thus becomes ideological: it determines whose knowledge is seen as innovative and whose as residual. If indigenous experience is framed as intellectually engaged rather than technologically excluded, dominant stereotypes are challenged.

  • The politics of representation therefore operates at two levels: it depicts technology not as neutral infrastructure but as a culturally shaped system, and Adivasi culture not as static tradition but as an active participant in contemporary knowledge systems.

📖 POST-VIEWING REFLECTIVE ESSAY

TASK 1 — AI, Bias & Epistemic Representation


AI, Bias, and the Politics of Knowledge in Humans in the Loop


Artificial intelligence is frequently presented as neutral computation—objective, mathematical, and detached from social context. Humans in the Loop, directed by Aranya Sahay, challenges this assumption by representing AI as culturally produced and ideologically structured. The film argues that algorithmic systems do not merely process data; they inherit the assumptions, hierarchies, and exclusions embedded within the societies that design and sustain them. Through its focus on Nehma, an Adivasi woman engaged in data labelling work in Jharkhand, the film exposes algorithmic bias as socially situated and reveals the epistemic hierarchies that determine whose knowledge counts within technological systems.


The narrative foregrounds the human infrastructure behind machine learning. Rather than portraying AI as autonomous intelligence, the film repeatedly shows Nehma performing classification tasks—drawing bounding boxes, assigning labels, and verifying categories. These acts reveal that AI learning is dependent on human interpretation. The so-called “learning” of the machine is a structured repetition of human judgment. By situating the camera within the workspace, the film dismantles the myth of technical neutrality. AI emerges not as an independent entity but as a system shaped by selective data and predefined categories.


Algorithmic bias is presented as structurally embedded rather than accidental. When Nehma labels images according to rigid taxonomies, the film highlights the reduction inherent in computational classification. Complex social identities and ecological realities are compressed into singular tags such as “professional,” “normal,” or “violent.” The repetition of bounding boxes visually reinforces this reduction. The screen fragments lived experience into measurable units, illustrating how algorithmic systems simplify multiplicity into standardized data points. Bias thus appears as a design consequence: it arises from the limitations and assumptions built into the classificatory framework itself.
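The reduction described above can be made concrete in code. A bounding-box annotation typically stores only a coordinate rectangle and a single tag drawn from a fixed taxonomy; anything outside that taxonomy is literally unrepresentable in the schema. The following sketch is purely illustrative: the field names are hypothetical, and the label set simply echoes the tags mentioned in the film rather than any real platform's taxonomy.

```python
from dataclasses import dataclass

# A hypothetical, deliberately rigid taxonomy of the kind data-labelling
# platforms impose: every image region must receive exactly one tag.
TAXONOMY = {"professional", "normal", "violent"}

@dataclass
class Annotation:
    """One unit of annotation labour: a bounding box plus a single label."""
    x: int       # left edge of the box, in pixels
    y: int       # top edge of the box, in pixels
    width: int
    height: int
    label: str   # must come from TAXONOMY; all context is discarded

    def __post_init__(self):
        if self.label not in TAXONOMY:
            # The schema has no field for ambiguity, relation, or context:
            # anything outside the predefined categories is rejected outright.
            raise ValueError(f"label {self.label!r} not in taxonomy")

# A complex scene is compressed into a coordinate rectangle and one word.
a = Annotation(x=120, y=40, width=200, height=150, label="normal")
print(a.label)  # the only thing the downstream model will ever "know"
```

The bias the film describes lives in the `TAXONOMY` line: whoever writes that set decides in advance what the machine can ever perceive.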


The film further demonstrates that such classifications are culturally situated. Nehma’s indigenous ecological knowledge—rooted in relational understanding of land, seasonality, and community—cannot easily be translated into fixed digital categories. Her pauses and hesitations while labelling forest imagery signal a disjunction between lived knowledge and algorithmic logic. What she understands contextually must be reformulated into abstract, decontextualized inputs. This tension reveals that AI systems privilege particular epistemologies—often standardized, Western, and market-oriented—while marginalizing others. Bias therefore reflects the dominance of one knowledge system over another.

This dynamic illustrates epistemic hierarchy. The authority to define categories lies with distant clients and designers, not with those performing interpretive labour. Nehma contributes her cognitive effort to shaping the dataset, yet she does not control the conceptual framework guiding classification. Her knowledge is instrumentalized but not recognized as epistemically authoritative. The film thereby exposes what scholars term epistemic injustice: the systematic devaluation of certain knowers within institutional structures. Indigenous knowledge becomes raw material for machine training but is denied legitimacy as knowledge in its own right.

Apparatus Theory offers a useful framework for interpreting this critique. Traditionally associated with the ideological operations of cinema, Apparatus Theory argues that film positions spectators within structured systems of meaning that appear natural but are socially constructed. In Humans in the Loop, the AI interface functions analogously to a cinematic apparatus. It organizes perception through framing, bounding, and categorization. Just as the cinematic frame directs the viewer’s gaze, the algorithmic interface directs machine perception. Both systems produce meaning by delimiting what can be seen and how it can be interpreted. By foregrounding the interface rather than concealing it, the film reveals this structuring power. The ideological function of technology becomes visible rather than naturalized.

Representation plays a central role in this exposure. The film does not depict Nehma as technologically deficient or culturally static. Instead, it presents her as intellectually reflective and ethically aware. This representation disrupts dominant media narratives that frame Adivasi communities as outside modern technological processes. By positioning her at the centre of AI production, the film challenges the binary between tradition and modernity. The narrative suggests that indigenous identity and technological labour coexist within contemporary digital culture, complicating simplistic developmental hierarchies.

At the same time, the film avoids romanticizing indigeneity. Nehma’s knowledge does not automatically resolve the contradictions of machine learning. Rather, her situated understanding exposes the limits of universal classification. The forest imagery intercut with screen-based labour reinforces this contrast. Natural spaces are depicted with depth and texture, emphasizing relational complexity. In contrast, the digital interface appears flat and segmented. This formal juxtaposition underscores the epistemic tension between contextual knowledge and algorithmic abstraction.


Power relations remain central to the film’s argument. The unseen clients who define the categories embody structural authority. Their absence from the frame intensifies the asymmetry: control is exercised through data pipelines rather than physical presence. The labourer sees the interface but not the institutional decision-makers shaping it. This invisibility mirrors broader dynamics within digital capitalism, where those who design systems remain detached from those who execute micro-tasks. Algorithmic bias is therefore not only cultural but economic; it reflects hierarchies embedded within global technological production.


The metaphor of the “human in the loop” operates beyond technical terminology. In engineering discourse, the phrase refers to systems requiring human oversight. In the film, it acquires political resonance. Humans are necessary for AI training, yet they lack decision-making power. The loop suggests continuity, but it does not imply equality. Nehma’s participation sustains the system, but her epistemic authority remains constrained. The film thus reframes the loop as a site of asymmetrical dependency rather than collaborative co-creation.
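In engineering terms, the phrase names a feedback cycle in which model predictions are routed to a human for correction before being fed back into training. A minimal sketch (all function names and labels hypothetical) makes the asymmetry the film emphasizes visible: the reviewer may override the model's guess, but only with another label from a category list fixed elsewhere.

```python
CATEGORIES = ("professional", "normal", "violent")  # defined upstream by the client

def model_predict(item):
    # Stand-in for a trained classifier's guess; here it always says "normal".
    return "normal"

def human_review(item, prediction):
    """The human in the loop: accept or override the model's guess,
    but only with another label from the same fixed category list."""
    corrected = item.get("true_label", prediction)
    return corrected if corrected in CATEGORIES else prediction

def training_loop(items):
    # Each corrected label flows back into the dataset the model retrains on:
    # human judgment sustains the system without ever reshaping its categories.
    return [(item["id"], human_review(item, model_predict(item))) for item in items]

batch = [{"id": 1, "true_label": "violent"}, {"id": 2}]
print(training_loop(batch))  # [(1, 'violent'), (2, 'normal')]
```

Note that `human_review` can correct individual outputs but cannot touch `CATEGORIES`: participation without definitional authority, which is precisely the loop the film politicizes.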

Importantly, the narrative refrains from offering technological solutions. There is no suggestion that better coding alone can eliminate bias. Instead, the film situates bias within social structures. As long as datasets reflect unequal representation and categories privilege dominant frameworks, algorithmic outputs will reproduce those hierarchies. The absence of narrative closure reinforces this argument. Structural problems cannot be resolved through individual intervention.

Cinematically, the restrained style supports this critique. Close-ups of Nehma’s concentrated gaze emphasize the cognitive labour behind machine learning. The rhythmic clicking of the interface contrasts with the layered sounds of the forest, symbolizing the narrowing of perception within digital systems. Editing connects micro-actions—such as a single mouse click—to broader technological consequences, suggesting that large-scale AI infrastructures are built from countless small judgments. Form and argument align: the film’s aesthetic choices render visible what digital discourse obscures.

Ultimately, Humans in the Loop positions artificial intelligence as a mirror of societal structures rather than an autonomous force. Algorithmic bias is revealed as culturally situated because it emerges from selective epistemologies embedded within data and design. Epistemic hierarchies become visible through the unequal distribution of authority between those who classify and those who define categories. By foregrounding indigenous knowledge without romanticization, the film challenges the universality claimed by technological systems and insists on the plurality of intelligence.

Through its narrative focus and formal strategies, the film transforms AI from a symbol of futuristic innovation into a site of contemporary ideological struggle. Technology does not transcend culture; it is shaped by it. The machine learns what it is taught, and what it is taught reflects power relations. In exposing this dynamic, Humans in the Loop compels viewers to reconsider not only how artificial intelligence functions, but whose knowledge it encodes and whose it leaves outside the frame.

📖 POST-VIEWING REFLECTIVE ESSAY

TASK 2 — Labour & the Politics of Cinematic Visibility


Invisible Labour and Digital Capitalism in Humans in the Loop

Contemporary discourse surrounding artificial intelligence frequently emphasizes automation, efficiency, and technological self-sufficiency. In such narratives, human involvement appears minimal, peripheral, or obsolete. Humans in the Loop, directed by Aranya Sahay, disrupts this mythology by foregrounding the human labour that sustains machine learning systems. Through its sustained attention to Nehma’s daily routine as a data labeller in Jharkhand, the film renders visible the forms of cognitive and emotional work that remain obscured within digital capitalism. The film argues that AI is not a replacement for labour but a reorganization of labour—one that depends on marginalized workers while concealing their contribution. By employing Marxist film theory and representation studies, the film exposes how cinematic visibility becomes a political intervention into structures of exploitation.

The central achievement of the film lies in its visualization of invisible digital labour. Data labelling is repetitive, fragmented, and micro-task oriented. Nehma draws bounding boxes, assigns categories, and verifies annotations for extended hours. These actions are neither glamorous nor innovative in appearance. The camera frequently frames her within static compositions that emphasize monotony. Rows of computers, dim lighting, and confined workspaces communicate a sense of standardization. The visual repetition mirrors the repetitive logic of algorithmic classification. Through this aesthetic choice, the film challenges the rhetoric of intelligent automation by revealing the embodied effort beneath it.

Marxist film theory provides a critical lens for interpreting these representations. Under capitalism, labour is often alienated: workers are separated from the products of their work and from decision-making authority. Nehma participates in training AI systems that may operate globally, yet she has no connection to their final application. Her work is detached from visible outcomes. The interface mediates her labour, fragmenting it into isolated tasks that contribute to a larger system she does not control. This separation reflects alienation in digital form. The product appears autonomous, while the labour that produced it remains concealed.

The film also gestures toward Marx’s concept of commodity fetishism. In capitalist economies, commodities appear independent of the labour embedded within them. AI systems are marketed as seamless and self-learning technologies. Consumers interact with virtual assistants, recommendation engines, or automated tools without awareness of the human annotation that enables them. Humans in the Loop counters this fetishism by restoring the visibility of labour. Editing techniques connect Nehma’s small gestures—clicking, dragging, selecting—to broader technological processes. Through subtle match cuts and temporal continuity, the film suggests causality between micro-actions and macro-systems. What appears automated is revealed as accumulated human judgment.

Representation studies further illuminate the politics of visibility at work. Digital labour often occurs in the Global South while technological capital concentrates in the Global North. Although the film does not sensationalize this disparity, it implies structural imbalance through spatial framing. The clients and designers remain absent, existing only through instructions delivered via the interface. Authority is disembodied yet omnipresent. In contrast, the labourer’s body is continuously visible. This asymmetry highlights how recognition and control are unevenly distributed within global digital economies.

Beyond cognitive labour, the film foregrounds emotional labour. Nehma’s task requires interpretive decisions that sometimes conflict with her lived understanding. She must conform to externally defined categories even when they seem reductive. Close-ups of her eyes and facial expressions capture concentration, fatigue, and occasional hesitation. These moments reveal that data annotation is not mechanical input but sustained judgment. Emotional labour operates when workers manage internal responses to align with institutional expectations. Nehma suppresses doubt in order to maintain workflow, illustrating how affective regulation becomes part of technological production.

The film does not portray Nehma as a passive victim. Instead, it presents her as a thinking subject navigating structural constraints. By depicting her family life alongside workplace scenes, the narrative situates labour within broader social realities. Domestic responsibilities and economic necessity contextualize her participation in digital capitalism. This narrative strategy humanizes labour without reducing it to sentimentality. Empathy arises not through melodrama but through attention to everyday routine.

Cinematic form reinforces the critique. The mise-en-scène of the workspace is characterized by rigid lines and artificial illumination. The glow of computer screens dominates the frame, flattening depth and emphasizing enclosure. In contrast, scenes set in natural environments are shot with greater spatial openness and dynamic movement. This visual contrast underscores the transformation of labour from embodied engagement with environment to abstract interaction with interfaces. The difference in spatial texture symbolizes the abstraction central to digital economies.

Sound design intensifies this effect. The repetitive clicking of keyboards and low electronic hums create an acoustic environment distinct from the layered sounds of the forest. Dialogue is often minimal within the data centre, foregrounding mechanical rhythm over human conversation. The sonic landscape conveys isolation and monotony. Through auditory means, the film communicates the experiential dimension of labour—the sense of immersion within a system governed by algorithmic logic.

The politics of cinematic visibility extend beyond representation toward critique. By centering a marginalized Adivasi woman within technological production, the film disrupts assumptions about who contributes to innovation. Public discourse often associates AI with engineers, urban technologists, or corporate leaders. By contrast, Humans in the Loop reassigns visibility to those performing foundational tasks. This repositioning challenges cultural hierarchies that equate intellectual labour with elite spaces while obscuring distributed cognitive work.

The film invites both empathy and structural awareness. Empathy emerges through intimate framing of Nehma’s daily life. Structural awareness arises through repetition and formal restraint. There is no dramatic confrontation or overt protest. Instead, the critique unfolds through accumulation. The monotony itself becomes argument. By refusing spectacle, the film mirrors the invisibility it seeks to contest. The viewer must attend to what is ordinarily overlooked.

Importantly, the film does not propose simple solutions. It does not romanticize digital inclusion nor condemn technology outright. Rather, it exposes contradictions. AI depends on human labour yet is marketed as labour-saving. It promises efficiency while relying on cognitive intensity. It generates capital while distributing recognition unevenly. These tensions remain unresolved, reinforcing the structural nature of the problem.

Within cultural film theory, visibility is a form of power. To render labour visible is to contest its marginalization. By documenting annotation work in detail, the film performs an act of recuperation. It restores narrative weight to micro-tasks typically excluded from technological storytelling. In doing so, it reframes AI as a collective production shaped by economic hierarchies rather than isolated innovation.

The title itself encapsulates this argument. “Human in the loop” suggests technical oversight within automated systems. The film transforms this technical phrase into a political metaphor. Humans are indispensable to AI, yet their indispensability does not translate into authority. The loop signifies dependency without equality. Labour sustains the system but remains structurally subordinate.

Humans in the Loop reveals that digital capitalism reorganizes rather than eliminates labour. The invisibility of annotation work is not incidental; it is constitutive of technological spectacle. By employing cinematic form to foreground embodied effort, the film challenges viewers to reconsider the narratives surrounding artificial intelligence. Labour does not disappear in the age of AI. It becomes fragmented, distributed, and obscured. Through careful representation and formal restraint, the film restores visibility to that obscured labour and situates technological progress within the political economy that sustains it.

📖 POST-VIEWING REFLECTIVE ESSAY

TASK 3 — Film Form, Structure & Digital Culture


Film Form and the Aesthetics of Digital Culture in Humans in the Loop

While Humans in the Loop, directed by Aranya Sahay, engages critically with artificial intelligence, its philosophical argument is conveyed as much through film form as through narrative content. The film does not rely on expository explanation to critique digital culture. Instead, it constructs meaning through mise-en-scène, cinematography, editing, and sound. Through formal contrast between natural environments and digital workspaces, the film articulates a broader reflection on abstraction, reduction, and the transformation of human experience under algorithmic systems. A structuralist and formalist approach reveals how cinematic devices operate as systems of signification that parallel the film’s thematic concerns.

A central formal strategy in the film is the juxtaposition of two visual worlds: the organic landscape of Jharkhand and the enclosed digital workspace of the data-labelling centre. Natural spaces are filmed with textured depth, layered framing, and ambient lighting. The camera often remains attentive to environmental detail—leaves, soil, breath, distance—suggesting relationality and spatial continuity. These sequences emphasize embodiment and contextual awareness. In contrast, the data-labelling environment is marked by artificial light, flat composition, and constrained spatial design. Screens dominate the frame, often isolating Nehma within rigid boundaries. This opposition functions structurally as a binary code: organic versus digital, fluid versus categorical, relational versus segmented. Through these coded oppositions, the film communicates its critique of computational abstraction.

From a structuralist perspective, meaning emerges through difference. The forest does not merely serve as backdrop; it signifies multiplicity and context. The digital interface, by contrast, signifies reduction and quantification. The repeated visual motif of bounding boxes intensifies this symbolism. Each box encloses an object or face within measurable parameters. Cinematically, this graphic overlay fragments the frame into units, echoing the classificatory logic of machine learning. The bounding box becomes a signifier of epistemic reduction—transforming lived complexity into analysable data.

Cinematography reinforces this symbolic system. In natural scenes, the camera exhibits relative mobility, subtly adjusting perspective in response to movement. This mobility suggests perceptual openness. In the data centre, however, framing becomes more static and frontal. The repetition of similar angles across sequences produces visual monotony. This rigidity mirrors the repetitive logic of algorithmic processes. The spectator experiences a narrowing of visual dynamism within digital space, paralleling the narrowing of meaning within classification systems.

Editing patterns further articulate this contrast. Cross-cutting between forest imagery and annotation work creates an intellectual juxtaposition. A moment of ecological immersion is followed by the segmentation of that environment into labelled categories. This editing strategy functions as conceptual montage, encouraging viewers to recognize the gap between lived knowledge and digital representation. The transition from organic continuity to digital fragmentation is not neutral; it carries argumentative force. By placing these images in sequence, the film constructs a visual thesis about the transformation of knowledge under technological mediation.

Sound design deepens the experiential dimension of this argument. Forest scenes are characterized by layered ambient sounds—wind, birds, distant human activity. These sounds create acoustic depth and environmental presence. In contrast, the data centre is dominated by mechanical clicks, keyboard taps, and low electronic hums. Dialogue is often subdued beneath technological noise. This sonic shift produces a perceptual contraction. The natural world resonates with multiplicity, while the digital environment resonates with repetition. Through auditory means, the film conveys the affective texture of digital labour and abstraction.

Formalist analysis emphasizes that aesthetic choices generate meaning independently of explicit dialogue. In Humans in the Loop, close-ups of Nehma’s face function as focal points of subjectivity. The camera lingers on her gaze as she studies the screen. This visual emphasis creates a feedback loop: the viewer watches Nehma watching the machine. Such framing foregrounds cognitive effort and perceptual strain. The screen reflects light onto her face, symbolizing the inscription of digital logic onto human subjectivity. The image suggests that technological systems shape not only external representation but internal experience.

Sequencing also contributes to the film’s philosophical stance. The narrative unfolds without dramatic escalation or resolution. Repetition structures the temporal rhythm. Daily routines recur with slight variation, producing a cyclical sense of time. This structure mirrors the iterative logic of machine learning, which refines output through repeated input. By aligning narrative temporality with algorithmic repetition, the film embeds its thematic concern within form itself. The viewer experiences duration as process rather than event, reinforcing the film’s emphasis on labour and continuity.

The absence of spectacle is another significant formal decision. Many films about artificial intelligence rely on visual effects or futuristic imagery. Here, the emphasis remains grounded in everyday environments. This aesthetic restraint shifts attention from technological futurism to present social reality. The film’s realism resists sensationalization, encouraging analytical rather than emotional response. Through this restraint, the critique of digital culture becomes more grounded and credible.

Semiotically, the interface functions as a dominant sign. Its visual presence mediates the viewer’s understanding of AI. Rather than portraying complex algorithms, the film focuses on the act of annotation. This focus demystifies artificial intelligence by revealing its reliance on mundane tasks. The interface is not depicted as magical or autonomous; it is shown as dependent on human input. The visual prominence of cursors, bounding tools, and dropdown menus transforms abstract computation into visible procedure.

The interplay between interior and exterior spaces also carries symbolic weight. The data centre appears enclosed and temporally regulated, suggesting industrial organization. The forest appears open and temporally expansive, suggesting continuity beyond institutional structure. This spatial contrast communicates broader concerns about digital culture’s tendency to enclose and quantify experience. The viewer perceives a philosophical tension between environments governed by ecological rhythms and those governed by algorithmic metrics.

Importantly, the film avoids didactic exposition. It does not rely on explanatory voice-over to articulate its critique. Instead, meaning arises through juxtaposition, repetition, and contrast. This reliance on formal devices aligns with formalist narrative theory, which emphasizes that structure itself conveys ideology. The film trusts viewers to infer thematic connections through aesthetic patterning. Such restraint enhances interpretive engagement.

The cumulative effect of these formal strategies is a meditation on digital culture’s reconfiguration of perception. The film suggests that algorithmic systems do not merely categorize external objects; they reshape how humans see and experience the world. By framing labour, identity, and environment through contrasting visual systems, Humans in the Loop dramatizes the transformation of relational knowledge into segmented data.

Thus, film form becomes inseparable from philosophical inquiry. The contrast between natural imagery and digital abstraction is not decorative but argumentative. Cinematic devices operate as conceptual tools, enabling critique through perception. By integrating mise-en-scène, editing, sound, and spatial design into its thematic framework, the film demonstrates how aesthetic structure can interrogate technological ideology. Through formal precision and narrative restraint, Humans in the Loop articulates a sustained reflection on digital culture and human-AI interaction without relying on spectacle or simplification.
