
Personalized Proprioception: Automated Infill Generation for Gradient Compressibility
Interview with David Burpee & Christoph Brandin
David Burpee, Senior Computational Designer at Brooks Running and founder of his own computational design practice, and Christoph Brandin, Director at Sequence Soles (SEQ) in Denmark, are working at a productive intersection of biomechanics, procedural design, and additive manufacturing.
Their collaboration has produced the Sequence Dynamic Insole Generator, an automated toolchain built in SideFX Houdini that translates human movement data directly into custom G-Code for silicone-printed orthotic insoles, bypassing conventional CAD-to-slicer workflows entirely.
At CDFAM Barcelona, they will present the system’s core architecture, including custom pathfinding algorithms developed to handle print continuity constraints, and a gradient compressibility framework that modulates proprioceptive input across the kinematic chain (squishy in the right places so your body figures out how to walk better).
What distinguishes this work is not only the technical scope but the design problem it addresses: building a computationally sophisticated system that a non-technical operator can use reliably across a wide range of input geometries.
The pipeline integrates biomechanical assessment data, scalar field-driven infill variation, and fabrication constraints into a single interface, treating the insole not as a passive support element but as a precisely tuned sensory interface between the body and the ground.


Can you introduce your work and explain what you’ll be presenting at CDFAM regarding the Sequence Dynamic Insole Generator and how it integrates human biomechanics with computational design?
David: I am a Computational Designer with experience across multiple industries, mostly revolving around performance-driven products. I am currently a Senior Computational Designer at Brooks Running on the Innovation and Elite Athlete team. I am also a lecturer, researcher, and have my own Computational Design practice.
Christoph: I am Director at Sequence (SEQ) in Denmark, where we translate dynamic human movement data into personalized footwear solutions that combine biomechanics, computational design, and advanced materials. I am also a Strategic Impact Advisor at KRING Ventures, supporting impact-driven innovation and venture building in the health and wellbeing space (SDG 3).
The insole generator was born out of a fairly simple problem space: human bodies are unique, and off-the-shelf slicer software does not provide the flexibility to respond to each body individually.
The goal of this project is to create an automated system that can consume multiple data types and embed biomechanical changes at the algorithmic level, translating directly to G-Code in order to manufacture custom orthotic products at scale, all wrapped up in a single, easy-to-use interface optimized for silicone printing.
How does your design process translate biomechanical input data into spatial logic for infill variation, and how can you validate its effectiveness for different users?
For us, the key challenge is translating dynamic biomechanical data into a spatial material language that the body can actually perceive and respond to.
We see the insole not as a passive support element, but as a sensory interface: pressure distribution, timing, asymmetries, and movement variability can inform where we increase or reduce local shock-absorption, energy-return, or support across the structure.
The goal is not simply correction, but to shape the signals coming from the ground in a way that helps the nervous system organize movement more efficiently.
To validate whether that logic is effective, we combine mechanical lab testing with human movement assessments in scientific studies. That means not only checking material performance and durability, but also testing whether different infill and geometric strategies lead to measurable changes in pressure patterns, balance, gait parameters, and comfort, and then re-testing those responses across different user profiles.

What parameters are most influential in shaping the proprioceptive behavior of the insole, and how are these tuned within the design environment?
The most influential parameters are peak deceleration and energy return resulting from specific cell geometry, density, wall structure, and how these are defined across specific regions of the foot.
Together, these parameters determine how much deceleration occurs, how quickly forces are returned, and what kind of sensory information is sent back during the entire movement. In our view, tuning proprioception means tuning the quality, timing, and location of sensory feedback rather than simply making a product softer or firmer.
Within the design environment, these variables can be adjusted region by region based on the functional role we want each zone to play. Some areas may be designed to dampen load, others to guide transition, and others to enhance perceptual input and stability.
The broader Sequence (SEQ) philosophy is that movement is learned through signals, so design becomes a way of modulating those signals in a precise and individualized way.
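As a rough illustration of what region-by-region tuning can look like in practice, here is a minimal sketch. The zone names, parameter set, and values are hypothetical stand-ins, not SEQ's actual profiles, and a real system would derive them from the wearer's assessment data:

```python
from dataclasses import dataclass

@dataclass
class ZoneProfile:
    """Hypothetical per-region infill parameters for one functional zone."""
    cell_size_mm: float   # lattice cell geometry scale
    density: float        # relative infill density, 0..1
    wall_mm: float        # cell wall thickness
    role: str             # functional intent of the zone

# Illustrative zone map; a real profile would come from the wearer's
# biomechanical assessment, not be hard-coded like this.
insole_zones = {
    "heel":     ZoneProfile(4.0, 0.30, 0.8, "dampen impact load"),
    "midfoot":  ZoneProfile(3.0, 0.45, 1.0, "guide transition"),
    "forefoot": ZoneProfile(2.5, 0.55, 1.2, "enhance perceptual input"),
}

def stiffness_index(z: ZoneProfile) -> float:
    """Toy proxy: denser cells with thicker walls relative to cell size
    read as 'firmer' underfoot."""
    return z.density * z.wall_mm / z.cell_size_mm
```

The point of the sketch is the structure, not the numbers: each zone carries both mechanical parameters and a declared functional role, so the same interface can express "dampen" in one region and "stimulate" in another.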

How did you develop your custom pathfinding and infill logic within Houdini, and what design principles guided the way compressibility gradients are distributed across the surface?
As it turns out, this is a fairly non-trivial problem, which many are probably familiar with. When 3D printing with silicone, because it is a liquid, toolpath continuity is critical to the resulting print quality. Adding to the complexity, Sequence holds a patent on an alternating, stacked toolpath approach that needed to be implemented in the pathfinding algorithm.
From the classic traveling salesman problem to weighted shortest-path implementations and other algorithms, I cycled through many different approaches to path generation. Some would work perfectly in 90% of cases but made the edge cases a nightmare; others worked okay some of the time. In the end I chose a hybrid approach with built-in error handling and fallback behavior that works mostly good, basically all of the time. Which, for a system that is intended to be used by a non-technical user across a wide range of input geometries, is, I think, an optimal approach. (“Mostly good, basically all the time” is coincidentally my new mantra.)
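SEQ's alternating, stacked toolpath is patented and the actual hybrid is not public, so the following is only a generic sketch of the pattern described here: a primary continuity-aware ordering (greedy nearest-endpoint, in the spirit of the TSP heuristics mentioned) with a fallback when it produces an unacceptable travel move:

```python
import math

def order_segments(segments, max_jump=5.0):
    """Order toolpath segments to keep travel moves short, which matters
    for liquid silicone where toolpath continuity drives print quality.

    segments: list of (start, end) pairs, each point an (x, y) tuple.
    Tries a greedy nearest-endpoint ordering first; if any travel move
    exceeds max_jump, falls back to the original (known-safe) ordering.
    Returns (ordered_segments, used_greedy_flag)."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    remaining = list(segments)
    ordered = [remaining.pop(0)]
    worst_jump = 0.0
    while remaining:
        tip = ordered[-1][1]  # current nozzle position
        # Pick the segment whose nearer endpoint is closest to the tip,
        # flipping its direction if its end point is the nearer one.
        best_i, best_flip, best_d = 0, False, float("inf")
        for i, (s, e) in enumerate(remaining):
            for flip, p in ((False, s), (True, e)):
                d = dist(tip, p)
                if d < best_d:
                    best_i, best_flip, best_d = i, flip, d
        s, e = remaining.pop(best_i)
        ordered.append((e, s) if best_flip else (s, e))
        worst_jump = max(worst_jump, best_d)

    if worst_jump > max_jump:  # fallback: keep the input ordering
        return list(segments), False
    return ordered, True
```

A production system would layer more fallback tiers and error handling on top of this, which is essentially what "mostly good, basically all of the time" buys you: a fast heuristic for the common case and a guaranteed-valid path when the heuristic misbehaves.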
The non-uniformity aspect is actually fairly simple. I take in a series of scalar value fields as data input and measure the gradient ascent/descent across them to establish a vector field that propagates the pattern to higher or lower densities. With some control and smoothing this approach works well, is computationally efficient, and is quite robust.
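A minimal sketch of that idea, assuming a single 2D scalar field sampled on a grid (the field, the blend weights, and the density window are illustrative assumptions, not the actual implementation):

```python
import numpy as np

def _normalize(a):
    """Rescale an array to [0, 1]; a constant array maps to all zeros."""
    lo, hi = float(a.min()), float(a.max())
    return (a - lo) / (hi - lo) if hi > lo else np.zeros_like(a, dtype=float)

def density_from_scalar_field(field, min_density=0.15, max_density=0.55):
    """Map a scalar field (e.g. peak plantar pressure on a grid) to a
    local infill density and a propagation direction.

    Returns (density, direction): per-cell density in the printable
    window, and the unit gradient vectors along which the infill
    pattern would be steered toward denser or sparser regions."""
    gy, gx = np.gradient(field.astype(float))  # row (y) and column (x) derivatives
    magnitude = np.hypot(gx, gy)

    # Normalize gradient vectors, guarding against flat regions.
    safe = np.where(magnitude > 0, magnitude, 1.0)
    direction = np.stack([gx / safe, gy / safe], axis=-1)

    # Blend the field value with its gradient magnitude, then rescale
    # into the printable density window. Weights here are illustrative.
    score = 0.7 * _normalize(field) + 0.3 * _normalize(magnitude)
    density = min_density + score * (max_density - min_density)
    return density, direction

# Example: synthetic pressure field with a 'hotspot' near the heel.
y, x = np.mgrid[0:32, 0:16]
pressure = np.exp(-((x - 8) ** 2 + (y - 26) ** 2) / 40.0)
density, direction = density_from_scalar_field(pressure)
```

In Houdini the same logic would typically live in VEX or a Python SOP over point attributes rather than a numpy grid, with the smoothing pass mentioned above applied to the score before rescaling.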
There is some more proprietary information on how these changes directly contribute to proprioceptive changes in the human body, but from an algorithmic perspective this has been the same implementation since version 1.0 of this tool.

How do you design a computational system that can be operated by a non-technical user or technician?
I would rank this as the second biggest problem to solve on this project. Many of us have built complex webs of computational systems, bespoke software plugins or applications, or both. I am a big believer in the elegance of algorithms or computational systems, and this project really allowed me to put my money where my mouth is in building a system that a complete novice can use and operate without much worry. The heart of this lies in the interface itself.
As a computational designer, you really have to shift your mentality to understand what your system does at the most fundamental level, and how a user might interact with it. In many ways, you begin to design the system backwards: you have to imagine first how someone is going to interact with the tool, how they expect it to behave, and what problems might arise.
Traditional computational systems are complex and nebulous, but when the tool is built for someone else to use fluidly, elegance and ease-of-use are key.
What do you hope to share with and learn from the CDFAM community through your participation this year?
We are excited to showcase the work done to date and offer a small glimpse into what the future of this project holds. It sits at the confluence of contemporary biomechanics, additive manufacturing, and computational approaches to truly bespoke product creation.
These events are always supercharged with energy and conversation, so we are very much looking forward to that aspect as well.

If you would like to join the supercharged energy and computational conversations, and get a glimpse of what the future of design, engineering, and architecture holds in store, join us at CDFAM Barcelona April 8-9 for talks of toolpaths, topology optimization, tapas, and more…
See the full program, and register to attend.
Oh, and you can also check out David’s presentation among others from CDFAM NYC in 2025 to get an idea of what you will be missing out on if you do not attend.





