In future manufacturing, human-machine interaction will evolve towards flexible and smart collaboration, meeting requirements both from the optimization of assembly processes and from motivated, skilled human behavior. Recently, human factors engineering has progressed substantially through detailed task analysis. However, there is still a lack of precise measurement of cognitive and sensorimotor patterns for analyzing long-term mental and physical strain. This work presents a novel methodology that enables real-time measurement of cognitive load, based on executive function analyses, as well as of biomechanical strain, using non-obtrusive wearable sensors. The methodology builds on 3D information recovery of the working cell using a precise stereo measurement device. The worker is equipped with eye tracking glasses and a set of wearable accelerometers, and wireless connectivity transmits the sensor data to a nearby PC for monitoring. Data analytics then recovers the 3D geometry of gaze and viewing frustum within the working cell, extracts the worker's task switching rate, and derives a skeleton-based approximation of the worker's posture together with an estimate of the biomechanical strain on muscles and joints. First results, enhanced by AI-based estimators, demonstrate a good match with the results of an activity analysis performed by occupational therapists.
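One of the quantities mentioned above, the task switching rate, can be illustrated with a minimal sketch: assuming gaze fixations have already been mapped to areas of interest (AOIs) in the working cell, the rate is the number of transitions between distinct AOIs per unit time. The function and AOI labels below are hypothetical, not taken from the paper.

```python
# Hypothetical sketch: task switching rate from a fixation-to-AOI sequence.
# Assumes upstream gaze analysis has labeled each fixation with an AOI name.

def task_switch_rate(aoi_sequence, duration_s):
    """Count transitions between consecutive, distinct AOIs,
    normalized by the observation duration in seconds."""
    switches = sum(1 for a, b in zip(aoi_sequence, aoi_sequence[1:]) if a != b)
    return switches / duration_s

# Example: six fixations over 10 s across three illustrative AOIs.
fixations = ["bin", "bin", "part", "screen", "screen", "part"]
print(task_switch_rate(fixations, 10.0))  # 3 switches / 10 s = 0.3
```

In practice, such a rate would be computed over a sliding window so that changes in switching behavior (a possible correlate of cognitive load) can be monitored in real time.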