The answer comes down to the project budget. Clients with big budgets can afford expensive GPU instances from AWS, while those with more limited budgets may require developers to put together a FastAPI-and-BERT solution that runs on a CPU in a rented virtual machine from a provider like Vast.ai. It all depends on the specific business case and available resources.
Upskilling: Learning More About AI Development
I would assume that in the short term, we will see some smart individuals and companies using NLP, LLMs, and statistics to analyze—and keep an eye on—the competition. There are many great articles on this topic; for example, this one discusses how to monitor your competition using Google Bard. In the long term, I believe these tools and practices will become more commonplace for everyone to use, leveling the playing field.
Great question. I believe we definitely need to start preparing to teach AI basics to high school students (or even younger ones). One of the most powerful lessons for students to take to heart is that AI is not magic. At least today’s AI is not sentient; it is simply math. If the next generation could learn the foundations of AI and what’s under the hood, they might fear it less and be more inspired to experiment with it.
Hands On: Leveraging Artificial Intelligence, Machine Learning, and Large Language Models (LLMs)
—M.T.Z., Islamabad, Pakistan
How do you envision technologies like NLP, AI, and CV impacting search engine rankings? For example, how does ChatGPT affect SEO?
—M.D., Seattle, United States
Can you elaborate on the limits of AI predictive analytics? Which algorithms and technologies do you prefer for AI predictive analytics, and how do you best estimate their accuracy?
I’m seeing the current AI hype about how AI will revolutionize our lives, and it seems like it is here to stay and has the potential to accelerate future innovation. What are the absolute basics of AI that you think should be taught at high schools?
AI is already deeply embedded in healthcare. Fortunately (in my experience), funding isn’t always a problem in healthcare, so there is great potential for future AI innovation. Among newer research efforts, the one I find most fascinating is using deep learning for drug discovery (e.g., identifying antibacterial molecules). Though this is technically chemistry, it will have many applications in healthcare, and I believe it will give a huge boost to the future of humankind. However, one concern I have is that the many regulations and approval processes in this field move slowly, especially compared to the pace of AI.
—B.S., Amman, Jordan
My advice is to follow your passions and interests—if you find AI/ML exciting, give it a go and don’t depend on pre-built solutions or other engineers. On the other hand, if you don’t have time or don’t see a future with AI or ML, then pre-built products are a great option, especially since we’ve been in the midst of an unprecedented boom for AI tooling in the past six months or so. In one sentence: Choose your battles wisely.
What are your thoughts on the new AI chip being released by AMD? Is it going to revolutionize computing?
I don’t think we are yet at the point where we won’t need developers (though I’d estimate we could be in 10 to 15 years). Turning toward the near future, I would predict that AI may not be optimal for addressing edge cases, customizations, and the many special requests often desired by clients. So I would advise learning how to use generative AI to save time writing boilerplate code. Save your brainpower for tasks like ensuring the code works as intended in various scenarios. Instead of spending 40 hours developing one program, maybe you’ll work on 10 programs.
—L.U., Curitiba, Brazil
I know it’s a boring answer, but I don’t think we have the data needed yet to know if this chip will truly revolutionize computing. However, on a more insightful note, I was pleased when I saw the announcement because it brings competition to other AI chips—and I don’t believe that a monopoly is great for anyone.
That’s an interesting and tough question. Regarding the limits, I think before we predict something, we should analyze whether it is predictable and whether the needed data is available. It is easy to believe we can predict everything with AI, but unfortunately, we’re not there yet. Regarding preferred algorithms, I have a keen interest in neural networks, but I think decision trees are also great when solving specific problems (e.g., regression analysis).
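To make the decision-tree point above concrete, here is a minimal scikit-learn sketch of a tree used for regression. The toy data and hyperparameters are illustrative assumptions, not from the original discussion:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Toy regression problem: learn y = 2x from a handful of points
X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0]])
y = np.array([2.0, 4.0, 6.0, 8.0, 10.0])

# A shallow tree stays interpretable; max_depth also limits overfitting
model = DecisionTreeRegressor(max_depth=3, random_state=0)
model.fit(X, y)

prediction = model.predict([[3.0]])[0]
```

On real data you would split into train/test sets and cross-validate, which is one practical way to estimate the accuracy the question asks about.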
—A.D.R., Como, Italy
Considering that LLMs have started to write code, what are the primary hard skills I should learn to stay competitive as a developer and implement AI into engineering processes?
—M.Z., Santa Clarita, United States
As a developer with no experience in AI/ML theory, what is the best way I can start leveraging machine learning or artificial intelligence technology when building products? Is relying on pre-built, black box solutions (e.g., Amazon Rekognition or Textract) naive? Is it worth the time and effort to understand the theory behind everything?
—M.D., Seattle, United States
This wide-ranging Q&A is a summary of a recent ask-me-anything-style Slack forum in which de Oliveira fielded questions about AI from other Toptal engineers around the world. It starts with the most important current and future applications of AI for modern businesses, then moves on to more advanced AI and machine learning questions for technologists.
—S.L., London, United Kingdom
I’m working on LLM model deployment in production. I plan to create an API for the model using FastAPI and deploy it to Hugging Face or another cloud platform. Are there any alternative options or methods to consider?
Having worked with data and technology across major industries like healthcare, energy, finance, and supply chains for more than a decade, Toptal AI developer Joao Diogo de Oliveira has a uniquely comprehensive perspective on the practical applications of AI. In the last six years, he has focused on AI and machine learning (ML), tackling the field’s most critical areas: prediction models, computer vision (CV), natural language processing (NLP), and large language models (LLMs) like GPT.
Is it possible to extend an LLM to answer questions in real time (or within a few hours)?
—K.C., Berlin, Germany
Can you suggest helpful resources, tools, frameworks, or sample projects for those hoping to become AI or ML engineers?
I would suggest starting small and focusing on NLP first. Once you are versed in NLP fundamentals, you can explore LLM nanodegrees through online learning platforms to understand core concepts like embeddings and transformers. Last but not least, I’d recommend playing with Hugging Face, which should be easy since you have an AI background.
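As a starting point for the embeddings concept mentioned above, here is a small sketch of producing sentence embeddings with Hugging Face Transformers. The model checkpoint and mean-pooling step are common illustrative choices, not something prescribed in the original answer:

```python
import torch
from transformers import AutoModel, AutoTokenizer

# A small, widely used checkpoint; any Hugging Face model ID works here
model_id = "sentence-transformers/all-MiniLM-L6-v2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

sentences = ["AI is not magic.", "It is simply math."]
batch = tokenizer(sentences, padding=True, return_tensors="pt")

with torch.no_grad():
    output = model(**batch)

# Mean-pool token embeddings (ignoring padding) into one vector per sentence
mask = batch["attention_mask"].unsqueeze(-1)
embeddings = (output.last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1)
```

Each row of `embeddings` is a fixed-size vector you can compare with cosine similarity, which is the foundation of semantic search and many LLM retrieval pipelines.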
How can ML and NLP technologies be efficiently integrated into Firebase?
Based on your experience, what are the primary applications and benefits of AI in healthcare? What do you see as the future of AI in healthcare?
—M.D., Seattle, United States
—D.P., Bengaluru, India
Yes, it is. Obviously, there’s always some latency, and the bigger the model, the longer it will take to generate predictions (or the more GPU resources will be required).
I’d recommend two main resources. First, nanodegrees (online certified programs) are a great place to start. Stanford Online’s machine learning coursework is beneficial if you’re new to AI and data science. Second, to build up your experience and start playing around with AI/ML technologies, Kaggle projects and competitions are valuable resources that offer many opportunities to network and learn from others.