Survey Design: Okay, but how does it _feel_?
One would think that we’ve more or less figured survey UI out by now. Multiple choice questions, checkbox questions, matrix questions, dropdown questions, freeform text fields, numerical scales: what more could one possibly need?
And yet, every time I led one of the State Of … surveys, and especially the inaugural State of HTML 2023 Survey, I kept hitting the same wall: the established answering UIs were woefully inadequate for balancing a good user experience with good insights for stakeholders. Because the State Of surveys ran on a custom survey app, I could often convince the engineering team to implement a new answering UI, but not always.

After joining Font Awesome, I somehow found myself leading yet another survey, despite swearing never to do this again. 🥲 Alas, building a custom survey UI was simply not justifiable in this case; I had to make do with the existing options out there [1], and was once again reminded of this exact pain.
So what are these cases, and how could better answering UIs help? This case study is Part 1 of (what I’m hoping will be) a series on how survey UI innovations can help balance the tradeoffs between user experience and data quality.