Virtually overnight, AI language models such as ChatGPT, Microsoft Copilot, and Gemini have become essential tools in the knowledge worker’s kit. But while they can supercharge productivity in some areas of business operations, they can seriously backfire when applied to the wrong tasks – risking quality, reputation, or worse. Organizational success increasingly depends on leaders’ ability to discern some basic dos and don’ts of AI. 

Don’t use it to replace your human content marketers


Setting aside the copyright risks and factual errors associated with AI-generated media, copy written by a bot often sounds like… well, copy written by a bot. 

“Nothing straight out of AI is shippable to a client,” warns Ryan Waite, VP of Public Affairs at the marketing firm Think Big. “Best-case scenario, it requires light editing, and worst-case, it’s not usable at all.” 

While a human writer has developed the craft to present information in fresh ways that stick with an audience, generative AI models reconfigure information according to previously established patterns, which can make for a hackneyed end result. Brand communications are supposed to convey domain authority and integrity to an organization’s clients and stakeholders. The stilted prose of a chatbot is at cross-purposes with this goal. 

Norty Cohen, an entrepreneur and the author of Prompting Originality: The A.I. Handbook for Humans, sums it up: “AI can make a lot of content, but it can’t tell you if it’s any good. That takes human intelligence.”

Do make it a content brainstorming buddy 


Although AI models are lousy creative producers, they can be useful creative partners for organizing research, creating outlines, brainstorming ideas, and even prototyping concepts for market testing. AI models can also be valuable tools for leaders whose jobs sometimes call for writing – but who aren’t professional writers themselves. 

Anne Kim, the CEO of Array Insights – an AI platform company serving the healthcare industry – uses AI to sketch out the rough contours of her first drafts and, later on, to give her documents a final grammar edit. “But what happens in between has to be done with a human,” Kim says. “I have yet to find an AI that can perfectly understand my business better than me, or articulate it perfectly for my audience.” 

Brandon Rollins, who has a background in IT and is the founder and CEO at Pangea Marketing Agency, sometimes gets the creative juices flowing by asking an AI to give him an opinion based on what a famous business professional or philosopher might say. “I’ve gotten a lot of good ideas out of ChatGPT by asking it to roleplay as Nietzsche or Laozi or any of a billion other big thinkers,” he says. “While ChatGPT misunderstands their philosophies, it nevertheless gives really interesting answers.”

Don’t use it for complex mathematics

It would be reasonable to presume that AI language models are good at math. After all, isn’t math a language like any other? Well, not exactly. 

“Asking AI models like ChatGPT and other LLMs to run complex mathematical equations is generally a bad idea,” Rollins says. 

Why? Solving complex math problems goes beyond the probability-based pattern matching that determines how current language models respond to prompts. It requires deep reasoning and a human-like capacity for logic that technologists haven’t yet figured out how to build into machine learning systems. Many AI researchers believe that when this happens, it will mark a new era of far more capable AI. For better or worse, we aren’t there yet. 

Do use it for finding data insights

AI models can’t reliably solve complicated math equations, but they can crunch large volumes of data very quickly. 

Rollins says that he has personally used ChatGPT and Gemini to write Excel formulas, including complex Visual Basic for Applications (VBA) code for Excel macros. “That VBA code ultimately helped me make a ‘bank balance predictor’ for my business, and one which has had a pretty good track record for the last several months,” he explains, with the caveat that his technical know-how likely helped the process along.
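Rollins doesn’t share his actual VBA code, but the underlying idea can be sketched in a few lines. Here’s a minimal, hypothetical version in Python – all account figures, dates, and function names below are invented for illustration, not taken from his spreadsheet:

```python
# A minimal sketch (not Rollins' actual VBA) of a simple "bank balance
# predictor": extrapolate a future balance from the average daily net
# cash flow observed in recent history. All figures are made up.

from datetime import date

# (date, net cash flow for that day) -- hypothetical sample data
history = [
    (date(2024, 1, 2), -120.00),
    (date(2024, 1, 3), 450.00),
    (date(2024, 1, 4), -75.50),
    (date(2024, 1, 5), 300.00),
]

def predict_balance(current_balance, history, days_ahead):
    """Project the balance forward using the average daily net cash flow."""
    avg_daily_flow = sum(flow for _, flow in history) / len(history)
    return current_balance + avg_daily_flow * days_ahead

# Starting from a $1,000 balance, project 30 days out
print(round(predict_balance(1000.00, history, 30), 2))
```

A real predictor would account for recurring bills, seasonality, and uncertainty, but even a toy model like this shows why such a task suits a spreadsheet or script better than a chat prompt: the arithmetic is exact and repeatable.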

Programmers can also save time by using AI tools such as GitHub Copilot and Claude as coding assistants. “Github Copilot assists me with various aspects like summarizing and documenting code, writing test cases, and generating starter code that I can use to build upon,” says John. He cautions, however, that “the most important aspect of leveraging AI for any task is to validate and verify the insights generated by AI, since AI models are notoriously known to generate incorrect information.”
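In practice, that validation step can be as simple as wrapping AI-generated code in a few assertions you can check by hand before trusting it. A hypothetical sketch in Python – the function here stands in for something an assistant might generate, not output from any particular tool:

```python
# Suppose an AI assistant generated this helper. Before using it,
# verify it against cases whose answers you already know.

def median(values):
    """Return the median of a non-empty list of numbers."""
    ordered = sorted(values)
    mid = len(ordered) // 2
    if len(ordered) % 2:
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2

# Hand-checkable test cases act as a safety net for AI output
assert median([3, 1, 2]) == 2
assert median([1, 2, 3, 4]) == 2.5
assert median([5]) == 5
print("all checks passed")
```

If any assertion fails, the generated code goes back for revision rather than into production – exactly the verify-before-shipping habit John describes.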

Don’t neglect its potential to supplement learning 


Lifelong learning is crucial for ongoing professional relevance in a fast-changing world. These days, that means getting comfortable with new technologies – such as, of course, AI. But it can also mean using AI tools for learning. 

Today’s knowledge workers have access to a growing suite of AI products designed to assist with more complex tasks, including learning. In fact, facilitated learning is a core pillar for Rovo, one of our newest products here at Atlassian. Using our own proprietary data in addition to third-party data from users’ organizations, our AI-powered product is calibrated to make critical information easier to find and make sense of. Rovo is just one example of how AI tools can help equip individuals and teams with the knowledge to be successful. 

Do upskill your AI savvy

It’s a good idea to include mastery of AI tools in one’s lifelong learning mission. Free online courses for learning the ropes abound, along with ample blogs and resources that can help leaders develop an AI strategy. 

To really up one’s AI game, nothing beats first-hand experience. “AI is very experiential and personal, so to really upskill, you have to be willing to try it – to see what it can or can’t do firsthand,” says Joshua Rickel, VP of Product at Perigon, an AI platform company that distills and assesses real-time news and public-opinion insights. “Treat it like another employee or intern that supports you personally. Lay out the steps you would take and see how they work. There are all sorts of things that it can do that will surprise you if you invest a bit of time in the experiment.”

Special thanks to Alfredo Huitron for his contributions to this article.

Do this, not that: A user’s guide to generative AI