Simulating Virtual Twins
Simulations are a cornerstone of a virtual twin approach to designing products and understanding scientific phenomena. A huge variety of predictive computational tools covers systems of all imaginable sizes. For example, recent computational studies related to the COVID-19 response range from atomistic simulations of viral proteins to computational fluid dynamics of entire hospitals.
As a general rule of thumb, simulations become more complex and detailed as we move from the macroscopic to the microscopic to the nanoscale. In most cases, this also means a higher computational cost for modeling a sample of a given size. For example, a system description of a simple spring could consist of a single equation with one variable (its position). By contrast, it would be nearly impossible to model all of its atoms simultaneously, even for the very small, spring-like cantilevers used in atomic force microscopes.
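To make that contrast concrete, here is a minimal sketch of the single-variable macroscopic description: a harmonic spring whose entire state is one position coordinate. The stiffness and mass values are purely illustrative, not real cantilever data.

```python
import math

# Macroscopic model of a spring-like cantilever: one degree of freedom.
# k (stiffness) and m (effective mass) are illustrative values only.
k = 40.0       # N/m
m = 1.0e-9     # kg
omega = math.sqrt(k / m)  # natural angular frequency (rad/s)

def position(t, x0=1.0e-9):
    """Analytic solution of m*x'' = -k*x for initial displacement x0."""
    return x0 * math.cos(omega * t)

# After one full period the spring returns to its initial displacement.
period = 2.0 * math.pi / omega
```

One equation, one variable: this is the whole model. An atomistic description of the same cantilever would instead track three coordinates per atom for billions of atoms.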
On the other hand, trying to understand the influence of a drug molecule on a protein receptor or the effect of chemical composition on the performance of a catalyst by simulating each molecule as a single particle would lack the chemical detail to be meaningful.
Choosing the right simulation method
Selecting an appropriate simulation method for a given problem is a highly nontrivial decision even for seasoned professionals, especially at the mesoscopic and microscopic scales. This post offers a straightforward rule to help you identify the right simulation technique: specifically, when to switch from macroscopic (part-level) descriptions to particle-level calculations (atoms or beads), and when to move from atomistic simulations to an electronic (quantum mechanical) representation.
The key to selecting the right computational method is to identify the underlying fluctuations driving the phenomenon you would like to understand.
What are the smallest changes without which the physical or chemical effect simply does not work?
In many situations, this identification is reasonably straightforward. For example, the thermal motion of atoms in any material gives rise to its entropy and to temperature-related properties such as heat capacity or thermal expansion. Understanding the origin of these fluctuations at the nanoscale typically requires atomistic simulations that follow the motion of individual atoms with femtosecond resolution. This becomes slightly more difficult with metals, where electrons also carry much of the heat, so a quantum mechanical calculation is needed to treat the underlying electronic fluctuations. Alternatively, if you want to model the flow of heat through a material sample consisting of individual grains, a larger-scale 'representative volume element' (RVE) type of calculation is more appropriate; there, an all-atom simulation would take an inordinate amount of time without yielding much additional information.
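In the RVE spirit, the sample is coarse-grained into cells that each carry a continuum field value instead of individual atoms. The sketch below (all parameter values illustrative, not tied to any real material) propagates heat through a one-dimensional bar with an explicit finite-difference scheme:

```python
import numpy as np

# RVE-style coarse-grained heat transport: the sample is divided into
# cells, each with one temperature, rather than tracking atoms.
# All parameters are illustrative.
alpha = 1.0e-5          # thermal diffusivity (m^2/s), assumed
dx, dt = 1.0e-3, 1.0e-2  # cell size (m) and time step (s)
ncells, nsteps = 50, 500

T = np.full(ncells, 300.0)  # uniform 300 K sample
T[0] = 400.0                # heated left boundary

for _ in range(nsteps):
    # Discrete Laplacian of the interior cells
    lap = T[:-2] - 2.0 * T[1:-1] + T[2:]
    T[1:-1] += alpha * dt / dx**2 * lap
    T[0], T[-1] = 400.0, 300.0  # fixed-temperature boundaries
```

After the loop, heat has diffused partway into the bar. Note the time step here is ten milliseconds, some thirteen orders of magnitude larger than the femtosecond steps an atomistic simulation of the same sample would need.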
Simulating the physical and chemical properties of composite materials is more complicated. Engineers are often interested in mechanical strength, weight and their statistical variation, which part-level simulations may capture sufficiently. Designing a material from scratch, however, requires going back to the microscopic structure, e.g., studying fluctuations in atomic composition at the nanoscale. These fluctuations drive composite curing during manufacturing by determining which molecules cross-link and which remain unreacted; hence, they play a defining role in establishing the material's final properties. Understanding this process in detail requires detailed atomic-level simulations. The atomistic results can then feed an RVE model that links them to the larger-scale thermal and mechanical properties engineers care about.
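As a toy illustration of how random fluctuations shape the cured network, consider a hypothetical stochastic sketch (not a real curing model): randomly occurring encounters between reactive groups succeed with some probability, and the degree of conversion that results varies from run to run.

```python
import random

# Toy stochastic sketch of cross-linking during cure (illustrative only).
# Each step, a random encounter between two unreacted groups succeeds
# with probability p_react; chance decides which groups end up bonded.
random.seed(0)
n_groups = 1000    # reactive groups in the sample (hypothetical)
p_react = 0.3      # per-encounter reaction probability (hypothetical)
n_steps = 1000

unreacted = n_groups
for _ in range(n_steps):
    if unreacted >= 2 and random.random() < p_react:
        unreacted -= 2          # two groups form one cross-link

conversion = 1.0 - unreacted / n_groups
```

Rerunning with different random seeds gives slightly different conversions: that run-to-run spread is the fluctuation the text describes, and resolving its chemical origin is what demands atomic-level detail.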
Our software covers those aspects driven by microscopic thermal and electronic fluctuations, while our sister brands SIMULIA, CATIA and SOLIDWORKS provide world-class simulation software and expertise supporting larger-scale predictive simulations.