By Patrick Law

Streamlining Prompt Engineering with ChatGPT: Tips and Tools for Success



Artificial intelligence has revolutionized the engineering industry, providing innovative solutions to complex problems. From improving manufacturing processes to automating repetitive tasks, AI has the potential to transform the way engineers work. However, one of the biggest challenges that engineers face when utilizing AI is crafting prompts that elicit the right responses.

Every industry has its unique set of customers and problems, which means finding the right prompts can be tricky. Fortunately, ChatGPT, a conversational model by OpenAI, is here to help. This interactive model is designed to provide appropriate responses within the context of a dialogue. It can assist with a wide range of tasks, such as answering questions, suggesting recipes, writing lyrics, generating code, and much more.

What sets ChatGPT apart is its ability to learn from human feedback. The model is trained using Reinforcement Learning from Human Feedback (RLHF), which means that it can improve its responses over time by learning from human input. This helps to reduce harmful and untruthful outputs, making ChatGPT a reliable and trustworthy tool for engineers.

In this blog, you’ll learn all about the different applications of prompt engineering and how it’s being used to push the boundaries in Process Engineering and other related engineering practices.


But before we get started, have you ever wondered what makes these prompts effective and precise?

Prompt components can vary depending on what you're trying to accomplish. Essentially, prompts are made up of different parts like instructions, examples, and guidelines. A short sketch showing how these pieces can fit together follows the list below.


Examples of prompt components include:

  • A question or statement that sets the context.

  • Specific keywords or phrases the model should include or avoid.

  • Input data or variables that the model should use.

  • Formatting or stylistic guidelines for the response.

  • Examples of desired responses or previous successful responses.

  • Constraints or limitations on the response length or complexity.
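
Putting these pieces together, here is a minimal sketch in Python of how a prompt might be assembled from the components above. The variable names and the sample task are our own illustrations, not part of any library or standard.

    # A minimal sketch: assembling a prompt from its components.
    # The component values below are illustrative assumptions only.
    context = "You are assisting a process engineer with hydraulic calculations."
    instruction = "Calculate the pressure drop in a horizontal pipe."
    input_data = "Length: 50 m, diameter: 6 in, flow rate: 500 m3/h, fluid: crude oil."
    formatting = "Report the result in kPa and show the main steps."
    constraint = "Keep the answer under 150 words."

    prompt = "\n".join([context, instruction, input_data, formatting, constraint])
    print(prompt)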

Sometimes, we find ourselves frustrated by the irrelevant or nonsensical responses that AI language models like GPT-4 produce. It's like trying to bake a cake without a recipe - you never know what you'll end up with.

Thankfully, there's a solution: effective prompt engineering. Think of it as creating a recipe that guides the AI model towards generating the output you desire.

The Prompt Engineering methodology consists of three primary categories that work together to create the perfect recipe for your AI model:

  1. Prompt Formulation: This category focuses on crafting well-structured prompts that include specific details, context, and examples. It's like giving your AI model all the necessary ingredients and instructions for baking the perfect cake.

  2. Model Control and Guidance: Just like adjusting the oven temperature and baking time can impact the cake's final outcome, adjusting the model's parameters and guiding its thought process can refine the AI model's output. It's like tweaking the recipe to get the cake just right.

  3. Iteration and Improvement: Creating the perfect recipe takes time, and prompt engineering is no exception. By analyzing the AI model's responses and refining the prompts accordingly, we can achieve more accurate and focused results - like tweaking the recipe until it's just right.

Enhancing your prompts for ChatGPT takes practice, so we've put together some tried-and-true strategies to help you get the best results, especially in the field of engineering.

1. Clarity is key: Make sure your prompt is easy to understand and leaves no room for confusion. Tell the model exactly what you're looking for.


Sample Prompt: "Calculate the pressure drop in a 50-meter-long, 6-inch-diameter horizontal pipe transporting crude oil at a flow rate of 500 m3/h. The oil has a density of 850 kg/m3 and a dynamic viscosity of 0.001 Pa·s. Use the Darcy-Weisbach equation and assume a pipe roughness of 0.00005 m."


Explanation: By providing a clear and specific problem statement, the model can better understand the required calculations and input parameters. In this case, the prompt includes the pipe length, diameter, flow rate, fluid properties, and the appropriate equation to use, making it easier for the model to generate a more accurate result.
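
As a sanity check on whatever figure the model returns, the same calculation can be sketched directly. The outline below is a rough Python sketch that assumes turbulent flow and uses the Haaland approximation for the friction factor (the prompt itself only names the Darcy-Weisbach equation), so treat it as a reference calculation rather than a definitive implementation.

    import math

    # Inputs from the sample prompt
    L = 50.0            # pipe length, m
    D = 6 * 0.0254      # 6-inch diameter, converted to m
    Q = 500 / 3600      # flow rate, m3/s
    rho = 850.0         # density, kg/m3
    mu = 0.001          # dynamic viscosity, Pa·s
    eps = 0.00005       # pipe roughness, m

    A = math.pi * D**2 / 4     # cross-sectional area, m2
    v = Q / A                  # mean velocity, m/s
    Re = rho * v * D / mu      # Reynolds number

    # Haaland explicit approximation for the Darcy friction factor
    # (an assumption; the prompt leaves the correlation open)
    f = (-1.8 * math.log10((eps / (3.7 * D))**1.11 + 6.9 / Re))**-2

    dp = f * (L / D) * 0.5 * rho * v**2   # Darcy-Weisbach pressure drop, Pa
    print(f"Pressure drop: {dp / 1000:.0f} kPa")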


2. Provide guidance: Offer examples or context to help shape the response you want. This is especially useful when you need a specific format or structure.


Sample Prompt: "Determine the required heat exchanger surface area for cooling 1,000 kg/h of a process stream from 90°C to 40°C using water as the cooling medium. The process stream has a specific heat capacity of 2.5 kJ/kg·K and the overall heat transfer coefficient is 500 W/m2·K. Assume a logarithmic mean temperature difference (LMTD) of 35°C."


Explanation: By providing the necessary context, such as the specific heat capacity, overall heat transfer coefficient, and LMTD, the model has enough information to generate the correct calculations. This ensures the response will be relevant and well-structured.
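
For reference, the arithmetic this prompt asks for reduces to two relationships: the duty from the process stream (Q = m·cp·ΔT) and the area from the rate equation (A = Q / (U·LMTD)). A quick sketch with the values given in the prompt:

    # Heat exchanger sizing from the sample prompt
    m_dot = 1000 / 3600       # process stream mass flow, kg/s
    cp = 2500.0               # specific heat capacity, J/(kg·K)
    dT = 90 - 40              # temperature change, K
    U = 500.0                 # overall heat transfer coefficient, W/(m2·K)
    lmtd = 35.0               # log-mean temperature difference, K

    duty = m_dot * cp * dT    # heat duty, W (about 35 kW)
    area = duty / (U * lmtd)  # required surface area, m2 (about 2 m2)
    print(f"Duty: {duty / 1000:.1f} kW, required area: {area:.2f} m2")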


3. Keep it short and sweet: Set a response length limit to avoid excessively long answers.


Sample Prompt: "Calculate the minimum size of a gas scrubber required to treat a gas flow of 1,000 m3/h, ensuring a maximum pressure drop of 250 Pa. Provide a concise answer."


Explanation: By setting a response length limit, the model is encouraged to give a focused answer that directly addresses the required calculation without unnecessary elaboration.


4. Break it down: For complex tasks, divide your prompt into smaller parts or questions, making it easier for the model to generate precise and relevant responses.


Sample Prompt: "Estimate the amount of heat required to vaporize a liquid mixture with the following steps: (1) Calculate the mass flow rate using the given volumetric flow rate of 200 m3/h and a density of 950 kg/m3; (2) Determine the heat capacity using the specific heat capacity of 2.2 kJ/kg·K and the temperature change from 25°C to 100°C; (3) Calculate the heat required using the heat capacity and the latent heat of vaporization of 250 kJ/kg."


Explanation: Dividing the problem into smaller steps helps the model to process the calculations sequentially, ensuring a more accurate and structured response.
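
The stepwise structure also maps neatly onto code, which makes it easy to check the model's arithmetic. A rough sketch following the same three steps, using the figures from the prompt:

    # Step 1: mass flow rate from volumetric flow and density
    Q_vol = 200 / 3600                   # volumetric flow rate, m3/s
    rho = 950.0                          # density, kg/m3
    m_dot = Q_vol * rho                  # mass flow rate, kg/s

    # Step 2: sensible heat to take the stream from 25°C to 100°C
    cp = 2200.0                          # specific heat capacity, J/(kg·K)
    sensible = m_dot * cp * (100 - 25)   # W

    # Step 3: add the latent heat of vaporization
    h_vap = 250_000.0                    # latent heat, J/kg
    total = sensible + m_dot * h_vap     # total heat required, W
    print(f"Heat required: {total / 1e6:.1f} MW")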


5. Mix it up: Experiment with different ways to phrase your prompt, including alternative keywords, synonyms, or sentence structures, to find what works best.


Sample Prompt: "Compute the required compressor power to increase the pressure of a natural gas stream from 1 bar to 50 bar. The gas has a flow rate of 5,000 m3/h, and its specific heat ratio is 1.4. Use the isentropic formula for compressor power calculations."


Explanation: By experimenting with different phrasing, the model may better understand the task at hand, leading to more accurate results.
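
However the prompt is phrased, it helps to know what the underlying formula looks like. A common form of the isentropic power relation is W = (k/(k-1))·P1·Q1·[(P2/P1)^((k-1)/k) - 1]; the sketch below assumes the 5,000 m3/h flow is measured at suction conditions and that the gas behaves ideally.

    # Isentropic compressor power (ideal-gas assumption)
    P1 = 1e5                 # suction pressure, Pa (1 bar)
    P2 = 50e5                # discharge pressure, Pa (50 bar)
    Q1 = 5000 / 3600         # volumetric flow at suction, m3/s
    k = 1.4                  # specific heat ratio

    W = (k / (k - 1)) * P1 * Q1 * ((P2 / P1)**((k - 1) / k) - 1)
    print(f"Ideal isentropic power: {W / 1e6:.2f} MW")   # roughly 1 MW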


6. Fine-tune the settings: Control the focus and randomness of the response by adjusting the model's temperature and top-p settings. Lower temperature values lead to more focused replies, while higher values add some unpredictability.


Sample Prompt: "[temperature:0.5] Calculate the required cooling tower size to dissipate 1 MW of heat, given an approach temperature of 5°C and a range of 10°C. Provide the answer in m2."


Explanation: By adjusting the model's temperature setting, the response will be more focused and less random, increasing the likelihood of an accurate and relevant response.
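
In practice, temperature and top-p are request parameters rather than text you type into the prompt: the ChatGPT web interface fixes them for you, and you only get direct control when calling the model through the OpenAI API or Playground. A minimal sketch using the pre-1.0 openai Python package (the model name, key, and prompt are placeholders):

    import openai

    openai.api_key = "YOUR_API_KEY"   # placeholder

    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",        # assumed model; any chat model works
        messages=[{
            "role": "user",
            "content": "Calculate the required cooling tower size to dissipate 1 MW "
                       "of heat, given an approach of 5°C and a range of 10°C.",
        }],
        temperature=0.5,   # lower = more focused, less random answers
        top_p=1.0,         # nucleus sampling left at its default here
        max_tokens=300,    # also caps the response length (see tip 3)
    )
    print(response["choices"][0]["message"]["content"])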


7. Foster critical thinking: For intricate questions or issues, ask the model to outline steps, factors, or pros and cons before giving a final answer, ensuring a more thorough and well-rounded response.


Sample Prompt: "Assess the feasibility of using a distillation column for separating a binary mixture of hydrocarbons with the following considerations: (1) List the key properties affecting separation efficiency, such as volatility and relative volatility; (2) Discuss potential operational challenges, such as flooding and foaming; (3) Propose any alternative separation methods if distillation is not suitable."


Explanation: By encouraging the model to consider multiple factors and evaluate alternatives, the response will be more comprehensive and well-rounded.


8. Keep improving: Prompt engineering is a continuous process. Test various prompt versions, evaluate the outcomes, and fine-tune your prompts based on the insights gained.


Sample Prompt: "Version 1: Calculate the pressure drop in a pipe transporting oil.

Version 2: Calculate the pressure drop in a horizontal pipe with a 6-inch diameter, transporting crude oil at a flow rate of 500 m3/h.

Version 3: Calculate the pressure drop in a 50-meter-long, 6-inch-diameter horizontal pipe transporting crude oil at a flow rate of 500 m3/h, given a density of 850 kg/m3 and a dynamic viscosity of 0.001 Pa·s."


Explanation: By continuously improving the prompt, we increase the likelihood of obtaining a more accurate and relevant response from the model.
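
One lightweight way to make this iteration systematic is to send each prompt version through the same API call and compare the answers side by side. A sketch, again assuming the pre-1.0 openai package used above:

    import openai

    openai.api_key = "YOUR_API_KEY"   # placeholder

    versions = [
        "Calculate the pressure drop in a pipe transporting oil.",
        "Calculate the pressure drop in a horizontal pipe with a 6-inch diameter, "
        "transporting crude oil at a flow rate of 500 m3/h.",
        "Calculate the pressure drop in a 50-meter-long, 6-inch-diameter horizontal "
        "pipe transporting crude oil at 500 m3/h, given a density of 850 kg/m3 and "
        "a dynamic viscosity of 0.001 Pa·s.",
    ]

    for i, prompt in enumerate(versions, start=1):
        reply = openai.ChatCompletion.create(
            model="gpt-3.5-turbo",
            messages=[{"role": "user", "content": prompt}],
            temperature=0.2,
        )
        print(f"--- Version {i} ---")
        print(reply["choices"][0]["message"]["content"])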


As the world of Engineering and AI continues to rapidly converge, it's essential to stay up-to-date with the latest techniques, applications, and limitations in prompt engineering. With this comprehensive guide, you can unleash the full potential of AI language models and create conversational systems that are not only informative but also engaging and dynamic.


