Walmart has set ChatGPT guidelines for its employees. Walmart Global Tech warned employees in a memo not to enter confidential information into ChatGPT. Walmart said it previously blocked ChatGPT due to "activity that presented risk to our company." The new guidelines also tell Walmart employees not to share customer information with ChatGPT.

Walmart had a clear directive for its employees Tuesday regarding generative artificial intelligence like ChatGPT: Do not share any information about Walmart with the rising technology.

In an internal memo to employees, Walmart Global Tech, the retailer's technology and software engineering arm, said it had previously blocked ChatGPT "after we noticed activity that presented risk to our company." The memo, which Insider has viewed, added: "We've since taken the time to evaluate and develop a set of usage guidelines around […]