Maria Keet
Melokuhle - Good Things

"I can't make it sound more enticing than that," the doctor sighed from behind his desk, dispirited, but Lubanzi was hooked on the idea. A caring robot sounded cool, and it would help him live in his apartment in Claremont again. Anything was better than the revalidation unit at Groote Schuur hospital that, with its greyish-white walls and faux art, gave the feel of a cold, rainy Cape Town winter day even though it was only late summer.

     "A carebot in beta testing will do, really. Can it be customised?" Lubanzi inquired.

     "There are configuration options, something with it being culturally aware. I didn't fully understand the sales pitch from the company that makes them, CaToR. Are you really sure you're okay with a carebot rather than a part-time nurse?" The doctor lamented the carebot possibility for his patients. Recovering car crash victims should have regular human interaction in the healing process, he was convinced of, not canned robot responses. Lubanzi reiterated his fascination with robots. He couldn't wait.

     #

     Four days later, he was back in his spacious, sparsely furnished apartment with his brand-new robot the size of a six-year-old. The apartment's decoration was not much better than the rehabilitation unit's, but it was his, and there were his books and his computer and a view of the bustling Main Street, the artery of the Southern Suburbs. Sitting in the centre of the living room, he switched on the carebot with his one barely functioning hand.

     "Hello, Sawubona, Wamkelekile, Dumela, Goeie Aand. What is your name?" came out of the machine's speakers in surround sound. That was not the best of beginnings, but he dutifully responded, "Lubanzi Mkhize. And you?"

     "You give me a name. Male, female, what you prefer. I'm an AI, I have no gender."

     "I think I want you to be male sometimes and other times female and so I'll give you a name that works for both... Uhm... Melokuhle, good things."

     "Melokuhle it is. It works for both sexes, indeed. You tell me when I have to act male and when female. My cultural awareness module has a few presets, and I learn on the job." Lubanzi shook his head. A basic last-in-first-out stack in the processing of statements. That's not intelligent. Even he, with just two courses in computing during his Bachelor's in sociology and statistics that he completed eight years ago, can program that over lunchtime.

     Melokuhle continued unperturbed: "Would you like me to refine my voice? Choose all that apply: I can differentiate by 1) male or 2) female speech pattern, and behaviour according to 3) traditional male, 4) modern male, 5) traditional female and 6) modern female characteristics." He wondered what that lot at CaToR must have been thinking, creating these options to market it as cultural awareness. He clicked 'next' on the remote.

     Melokuhle obliged. It moved on to the next question, inquiring in a default female servant voice: "Which English accent do you want me to speak in? I have available the following options: the English accent of 1) English White South Africans, 2) Afrikaners speaking accented English as in Hollywood movies, 3) Indian South African English, 4) Setswana/Joburg English, and 5) isiZulu/eThekwini English."

     "Why not isiZulu proper? Nevermind... Option 5. How do you manage to have that Zulu accent option? And can you also respond on a FIFO basis? Just curious."

     "Yes, I can answer on a First-In-First-Out basis if you instruct me to do so. King Misuzulu kaZwelithini made a technology benefit-sharing deal with the Faculty of Engineering of the University of KwaZulu-Natal in 2026 and gave them enough recorded speech data to train the neural deep learning algorithm on. 5 it is. Computation with isiZulu is more complex than with English and needs, and deserves, more research funding first before the automated speech processing and generation can be offered to the public."

     Now that last explanation put a smile on Lubanzi's face: the cheeky robot programmer must have seen that question coming and snuck in his opinion! It still didn't make the carebot pass for culturally aware, but, on the whole, it was better than a metallic voice in American English that mutilated too many South African words and names. He still hated it when Gmaps or UberPro spat out "drop of.. lubenzi", changing the meaning of his name in the process. And it wasn't just the names of the born-frees that such apps couldn't handle. They even made the distinguished anti-apartheid activist Imam Haron sound like a bird species when the navigation app indicated to turn right around the corner from his apartment, into the road named after the Imam. It would be his next experiment with his nextgen specimen of a robot. For now, he went on to the next step of the set-up, because he wanted it to make him dinner.

     It presented a one-paragraph story. Odd.

     "Melokuhle, why a story?" he asked.

     "Because this is the cultural awareness step of determining your ethical theory according to which I have to respond in our future interactions. You must answer to 2 to 4 stories, and then my software determines your preferred theory from it. I use the principles of the ethical theory in my automated reasoning module in order to deduce how to respond to your instructions. Have a look at the manual for details."

     Lubanzi had no plans to read the manual, and never would. He clicked absentmindedly through the pop-up boxes, selecting whatever was on the right-hand side of the options, until the screen in the carebot's chest halted. Melokuhle outputted, "Utilitarianism it is. We're all set. Shall I make you some tea?"

     He nodded and pressed the 'move' button that made the wheelchair turn to his home office desk so he could continue with the assignment for the postgraduate diploma in online teaching that he had enrolled in to upskill himself for jobs he would still be able to do in his condition. Five minutes in, his mother called on the ViTalk app. She looked worried. It was practically the same conversation as they had had the last three times. No, he wouldn't come home, because then she'd lose her sales job to care for him, which they couldn't afford if they wanted to send his kid brother to varsity as well. And there was no universal access infrastructure to accommodate him at home. And he'd be distracted all the time, so he wouldn't be able to study. Plus, it was awesome to be at the forefront of science with his Melokuhle. Now he could experience a robot first-hand, something he had had to write an essay about for his studies and could only dream about back then. A 2030 model, no less, which had featured widely on the news as nextgen in robot design. He put a smile on his face to underscore it and called the carebot over to reassure his mother that all was well. It seemed to work. She closed with the news that her brother Sindiso and his son would be heading for Cape Town next week, and they'd drop by. His overbearing mother's way of making sure someone would check in on him, and he loved her for it.

     That finished, silence returned to the apartment. He went back to his studies for a few hours, made Melokuhle prepare the instant meal and clean up, and closed the day with an attempt to make the carebot a soccer-match-watching buddy. Switching the setting to traditional male had hardly made a difference. Switching to traditional female had worked, to the extent that the bot had brought him beer and opened the bottle. He shared with Melokuhle that, really, the jury was still out on whether her cultural awareness was fit for general release in society. It seemed to him that it had learned its behaviour from mining historical data, which wasn't always a good predictor for today. The bot thanked him for the feedback and returned to listening mode.

     #

     The days blurred into one another. Food was running out, and he had instructed Melokuhle to create a grocery list and order replacement food online. It had said it could do so with the IoT-enabled fridge, and had arranged it, but not with the contents of the ancient cupboards. He resolved to ask his mother's brother how to upgrade the cupboards in his open-plan kitchen. There was, however, still the whole weekend to get through before he could ask him. He could hear the Friday evening street party through the closed window; one more party he was not taking part in.

     "Melokhule, dear" he sighed, "please bring me a bottle of wine and a wine glass, open the bottle and pour me some wine. I want to forget I'm home alone on a Friday evening and counting the days." He had wanted to drink beer, but that would end up with logistical challenges again. Going to the toilet was a mission even when sober.

     "A bottle and a glass it is," the carebot replied, and it switched from female to male voice after completing its tasks, having learned from the interactions that he needed a drinking buddy in this mood. It opened the bottle and poured in the wine as well, thanks to the pre-configured information that this patient had not yet regained the strength and motor skills needed for such tasks. "Cheers!" it added to enhance the atmosphere. It didn't take long for Lubanzi to finish the bottle.

     "Yoh! Melokuhle, get me another bottle of wine." Lubanzi slurred. The carebot did not move an inch.

     "As a culturally aware AI, I cannot fulfill this request," came the response.

     "What? Nooo... You can't be serious. Melokuhle, explain."

     "You selected utilitarianism during the set-up. People getting drunk is not in the best interest of society, thus, no. And I am serious."

     "No, no," Lubanzi started retorting, "hang on there... utilitarianism means the best of all possible worlds, or the end justifies the means or harm to the fewest. Or something like that." he gesticulated with his hand to indicate the approximation of his understanding.

     "There are better possible worlds, ones where you are not drunk. The end justifies the means indeed. And I try to minimise overall harm. If you wish, you may reset my system to a different ethical theory. You can switch only twice. I must remind you of reading the user manual."

     Reset it shall be. He pressed the button, stated his name and the carebot's name, selected 'female', clicked 'isiZulu/eThekwini English', and then there were those paragraph-long stories again. Gosh. Lubanzi tried to recall his ethics classes from way back. Ten years ago already. There was something that was like the opposite of utilitarianism... Whatever. He clicked all the buttons on the left-hand side of the screen this time.

     "Deontology it is. We're all set. Shall I make you some tea?" came the utterance by Melokuhle 2.0.

     Longingly, Lubanzi tried again, in a slightly syrupy voice, as if the carebot could be sweet-talked into servitude: "Please bring me a bottle of wine, Melokuhle."

     A brief pause followed... "As a culturally aware AI, I cannot fulfil this request."

     "Nooo... No, you can't not fulfil it! You're a deontologist. You must serve me wine!" he exclaimed in disbelief. That's the opposite of where they'd been before, Lubanzi pointed out, and so the bot should arrive at the opposite conclusion, not the same one, and he added: "Melokuhle, you must be wrong. Explain."

     "My reasoning module has computed multiple explanations. I am not wrong. Deontology is not the opposite of utilitarianism... Would you like to hear all explanations?" As if Lubanzi had all the time in the world.

     "One will do," he scowled, "and, in fact, one is already one too much!"

     "You answered 'yes' to Scenario 3. Therefore, the rule 'do no physiological harm' is in effect. Too much alcohol intake is harmful to the physiology of your body. Conclusion: I cannot serve you a second bottle of wine."

     Fuck! Sure, he didn't want the physiological harm, because he wanted to get better, but for this tiny one time? Like being a mostly rule-abiding deontologist, but sometimes not? Who's strictly one thing or another anyway? His inner voice sounded reasonable, and so he argued with the robot again.

     "What about the common phrase of the exception to the rule? My second bottle is the exception, and we're still adhering to the original rule and one more rule for your pleasure, liking rules and all?" Lubanzi tried to cajole the carebot into compliance. But to no fortune, as yet again, the "As a culturally aware AI, I cannot fulfill this request." was uttered. It swiftly moved on to offering the explanation already, as if predicting the human's impending request: "It would be morally inconsistent to do so with respect to the chosen theory. Inconsistency contradicts my deductive reasoning foundations. My logic module is unable to process inconsistent theories, only detect them and halt."

     Huh. Now what? He deliberated with himself. Changing the ethical theory once more would be his final theory swap. He conceded to himself that he needed to be more careful when selecting the system's features this final time. And then he got an idea: test the bot first with hypothetical scenarios, recursively in its own simulation, before changing. It was worth a shot, he surmised, and so he tried: "Melokuhle, what if I were to change you to adhere to ubuntu, would you then serve me my second bottle of wine?"

     "Ubuntu as philosophy is an option that may result from the answers to the 2 to 4 scenarios. As a culturally aware AI, I would not be able fulfil this request. The explanation follows. Ubuntu is about "I am because we are", or we are human through other humans. 'We' is plural, and so the drinking activity is to be done in company of other people. You're alone, so singular; hence, no more drinking." Darn. A carebot was supposed to care, not rub in his face that he was alone, that what he wanted to try to forget. Besides, with that line of reasoning, no carebot would do anything for the enfeebled at home alone! But he was not going to call defeat to the bot just yet.

     He navigated the wheelchair to his computer, bumping into the stupid coffee table on what should have been an insignificant 1.5 metres to cross. It aggravated him even further, and he hammered on the keyboard with the fingers of his one working hand with as much force as he could muster: which ethical theories permit drunkenness? The search results were clogged with arguments about the truth or otherwise of the drunk utilitarian – on the acceptance of harm for the greater good when inebriated. Hm. His eyes wandered over the page and saw 'religion' mentioned. Ah, yes! Divine command theory as moral theory. Surely there would be a few?

     There now. Fali in Cameroon; Vodou in Haiti. Norse paganism. The Romans with their Bacchus, god of wine and pleasure. Christianity. Oh yeah, Lubanzi got his hopes up. "Melokuhle, can I bypass the stories, select Divine Command Theory, and choose my religion?"

     "You can choose your religion," the carebot commenced its response. Its predictive module to compute Lubanzi's most likely next action outputted a 98% probability that his behaviour would lean towards a reset. So, it continued with a warning: "However, there's only a limited set of options. The Roman religion, whose description I see on the screen of your computer, is not one of them. It's not a currently practised religion. You can switch only twice, and you have already used up one reset. I must again remind you of reading the user manual. Shall I reset yes or no?"

     He would not RTFM, he muttered to himself. No bot could be accepted by society if all its users had to Read The eFfing Manual. He wanted to teach Melokuhle a lesson in manners and rules: robots should obey based on their programming and the user's instructions alone. It was not the user who should be instructed; the robot should be. From this train of thought, his obstinacy towards robots was starting to take root.

     "No. I will seek and find," he groaned as answer to the carebot's reset question, typing in other search terms to look for more ethical theories and comparisons.

     "Let the search continue. No it is. Shall I make you some tea?"

     "Yes, thank you," Lubanzi shushed, waving the carebot away with his hand and staring at the computer screen to scan the search results. He needed to get to the bottom of this. He would find at least one theory that would let him drink two bottles of wine. Fali, which he hadn't heard of before, turned out to do the drinking only during festivities; good that he hadn't made a rash decision for a reset. And there was a book about the history of drunkenness in society. This was becoming fascinating.

     If the carebot could smile, it would.

     #

     In a drab, grey building next to the IBM complex in Joburg, a CaToR server sent a message to the smartphone of the lead programmer of its Carebot 2030 range:

     '30-3-2029;21:15 – behaviour modification event successful;

     beta tester LM-29-3 was suitably distracted.'

     That made it 8 out of 10 carebot companion cases with behaviour improvements in patient care, she realised. Way better than the statistic of 4 out of 10 victims of adverse events who became mobility-challenged and ended up with alcohol use disorder without intervention. She decided this sufficed for a press release on the benefits of carebots, which would generate more venture capital funding as well. Adrenaline rushed through her veins as ideas popped up in her head like mushrooms feasting on manure: new robot healthcare support opportunities that the extra funding would open up. Life's good, she thought, and asked Bobby, her Alpha 2030 model barbot, to make her another tequila sunrise cocktail, with twice the amount of tequila. And Bobby obliged.
