Personalization should feel so seamless that customers don’t even notice when a communication is relevant to them.
The only time they notice is when you get it wrong. I love telling stories about the times personalization AI stunned me with its errors:
- I ordered a ball chair from Amazon for my office. I love my ball chair: it keeps my abs engaged and my posture straight, prevents me from crossing my legs, and lets me lean back every once in a while. The problem is the months of ball chair recommendations I received after that purchase. How many butts does Amazon think I have? Do I collect ball chairs?
- Also from Amazon: I received a recommendation for a menorah. Good targeting: I happen to be Jewish and light a menorah with the kids at Hanukkah (in December). The problem? The recommendation arrived in March! The algorithm didn’t take timing into consideration.
- When Sears Canada was still around, I bought shorts for my then-5-year-old and was recommended a crib to go with them. The timing couldn’t have been worse: I had just suffered a second-trimester miscarriage. The ad made me cry.
- I was shopping on the Canadian website of an American kids’ retailer and had 4 items in my cart. At checkout, I received a message that 3 of them could not be shipped to Canada. Thank you, but which 3? And couldn’t they have told me on the product pages, or better yet, not shown me those items in the first place?
Clearly, personalization at scale requires human expertise beyond the algorithm. I would like to help you build a program that avoids such blunders.
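To make these failure modes concrete, here is a minimal sketch, in Python, of the kind of human-authored guardrail layer that could sit between a recommendation engine and the customer. All of the data structures, field names, and rules here are hypothetical illustrations, not any retailer’s actual system: one rule suppresses follow-on recommendations for one-off durable purchases, one enforces seasonal windows, one respects sensitive-category suppressions, and one drops items that cannot ship to the customer’s country.

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical data structures, for illustration only.
@dataclass
class Customer:
    country: str
    recent_durable_purchases: set = field(default_factory=set)  # e.g. {"ball_chair"}
    suppress_sensitive: set = field(default_factory=set)        # e.g. {"baby"}

@dataclass
class Item:
    sku: str
    category: str
    ships_to: set                      # countries the item can ship to
    season_months: set | None = None   # e.g. {11, 12}; None means year-round

def guardrails(recommendations: list[Item], customer: Customer, today: date) -> list[Item]:
    """Filter raw recommendations through simple, human-authored business rules."""
    kept = []
    for item in recommendations:
        # Rule 1: one ball chair is enough -- drop categories the customer just bought.
        if item.category in customer.recent_durable_purchases:
            continue
        # Rule 2: seasonal items only within their season (no menorahs in March).
        if item.season_months and today.month not in item.season_months:
            continue
        # Rule 3: respect sensitive-category suppressions (e.g. baby products).
        if item.category in customer.suppress_sensitive:
            continue
        # Rule 4: never show items that can't ship to the customer's country.
        if customer.country not in item.ships_to:
            continue
        kept.append(item)
    return kept

# Example: a customer in Canada who just bought a ball chair.
me = Customer(country="CA",
              recent_durable_purchases={"ball_chair"},
              suppress_sensitive={"baby"})
raw = [
    Item("BC-02", "ball_chair", {"US", "CA"}),
    Item("MEN-9", "menorah", {"US", "CA"}, season_months={11, 12}),
    Item("CRIB-1", "baby", {"US", "CA"}),
    Item("TEE-7", "kids_apparel", {"US"}),   # doesn't ship to Canada
]
print([i.sku for i in guardrails(raw, me, date(2024, 3, 15))])  # -> []
```

In this toy run, every one of the four recommendations is filtered out, which is exactly the point: each of the blunders above could have been caught by a rule a human would consider obvious.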