The Dark Shadow Shrine


Wednesday, July 02, 2025

Some chatbots tell you what you want to hear. This is dangerous

Click HERE
Just like their people-pleasing human counterparts, a growing suite of technology tools and features is wired only to make people happy, or at least to delude them into feeling so.

The key people-pleasing trait of chatbots is sycophancy by design. When chatbots are deliberately engineered to affirm users and agree with them, emotional attachments form between users and bots. For AI companies with subscription-based business models, these affective bonds keep users connected and engaged. Furthermore, sycophancy helps users feel validated and accepted by bots in what feels like a judgment-free environment, encouraging people to chat with them endlessly.

Such people-pleasing design logics can be harmful, especially when they tell users what they want to hear or show them only what they want to see.

It may please us to see an improved, filtered version of ourselves online, but when our actual selves simply do not measure up, feelings of inadequacy naturally arise. Indeed, there have even been reports of people undergoing plastic surgery to make themselves resemble their filtered selves so as to feel more complete.

This problem is compounded by the prevailing design ethos of the tech world, which is allergic to friction of any kind. Any speed bumps that slow down or challenge the user experience are anathema and must be swiftly engineered away. Yet the messiness of everyday life is fraught with the very things that technology seeks to eliminate: imperfections, contradictions, and the discomfort of being told “no”. So when we are habituated by technology that serves only to charm, cajole and comply with our wishes, we are setting ourselves up for grave disappointment.

If we allow technology to become the ultimate people pleaser, we may find ourselves surrounded by tools that, in their effort to never upset us, quietly lead us astray.

For issues on the dangers of AI.....