Normally, an A/B test works by presenting the options to different users and then comparing their behavior against some designed outcome. It is almost never a matter of asking users to choose between options: you just give each group a different version and measure their engagement, which is often measured by sales or click-through rates. The version with the most engagement wins.
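To make that concrete, here is a minimal sketch of the mechanics: users are bucketed deterministically (hashing the user ID is one common approach, so each user always sees the same variant), and clicks per variant are tallied to compute a click-through rate. The user IDs and click events are made up for illustration.

```python
import hashlib

def assign_variant(user_id: str, variants=("A", "B")) -> str:
    # Hash the user ID so the same user always lands in the same bucket.
    digest = int(hashlib.sha256(user_id.encode()).hexdigest(), 16)
    return variants[digest % len(variants)]

# Hypothetical stream of (user_id, clicked) events.
events = [("u1", True), ("u2", False), ("u3", True), ("u4", False)]

views = {"A": 0, "B": 0}
clicks = {"A": 0, "B": 0}
for uid, clicked in events:
    v = assign_variant(uid)
    views[v] += 1
    clicks[v] += int(clicked)

# Click-through rate per variant; the higher one "wins".
ctr = {v: (clicks[v] / views[v] if views[v] else 0.0) for v in views}
```

In a real system the assignment happens server-side (or via an experimentation platform) and the events land in an analytics pipeline, but the core idea is exactly this: split, expose, count.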
https://www.usertesting.com - you can upload screenshots / prototypes / the actual webpages (if they exist) and have users go through a series of questions. You can also set it to scramble the order to see how (if at all) that changes preferences.
As other folks called out, this can give you a sense of customer comprehension / stated preferences & the reasons behind them, but it may not give you the level of confidence a true A/B test would (launching both iterations, splitting traffic between them, and measuring the difference in conversion).
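Once you have conversion counts from a real split test, the usual way to decide whether the difference is more than noise is a two-proportion z-test. A stdlib-only sketch, with made-up numbers for illustration:

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test for a difference in two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference).
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical results: 120/1000 conversions on A, 150/1000 on B.
z, p = two_proportion_z(120, 1000, 150, 1000)
```

With these numbers the p-value comes out just under 0.05, i.e. borderline significant; with smaller samples the same rate difference would not be distinguishable from chance, which is why traffic volume matters so much for this kind of test.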