item 47160578

XCSme | 4 days ago

Strong agree on clicks over surveys. Once I move from a video prototype to a live MVP, the real signal comes from watching what people actually do.

I built UXWizz mainly for this. Self-hosted heatmaps and recordings make it pretty obvious where people get confused or drop off, and you don’t have to rely on polite feedback [0].

[0] https://www.uxwizz.com

zhongyongxu | 4 days ago

Strong agree — clicks > surveys, and behavior > opinions.

The gap I'm trying to fill: before you have a live MVP (and before you invest in heatmap/recording infrastructure), how do you know which workflow is worth building in the first place?

Video prototypes are the "pre-MVP" behavior test — show the experience and see whether people click "I'd pay for this" versus just saying "interesting."

Curious: When building UXWizz, did you validate the "self-hosted vs cloud" decision with video prototypes, or did you ship and learn from early user behavior?

Feels like your tool captures the truth post-launch, mine tries to predict it pre-launch. Complementary approaches.