Rob Kaczmarek

As automotive engineers push the boundaries of combustion, an increasing number of studies are being carried out in the virtual world. The use of in-cylinder computational fluid dynamics (CFD) provides insight into what is happening as the combustion process takes place, shedding new light on everything from thermal efficiency to pollutant formation. Software developer Convergent Science was formed in 1997 by a group of combustion specialists, and the company's Converge CFD code has been developed with combustion simulation in mind. Convergent Science's director of global marketing, Rob Kaczmarek, gives us his take on what the future holds for combustion modelling.

Do you think the dieselgate scandal has changed the way that engineers approach combustion design?

I don't think dieselgate has changed the view of the engineers, but I think the OEMs as a whole will now be more stringent in their adherence to the regulations. If anything, this has highlighted the need for new techniques and fresh ideas to address the challenges we face.

With increasingly stringent emissions regulations on the horizon, do you think that designers are likely to make greater use of in-cylinder CFD?

Most definitely, and we are already seeing more enquiries from customers for this. As the requirements get more critical, the margin for error gets smaller, and that's likely to lead to more complex simulations. I see a shift towards more detailed chemistry, coupled with CFD, in order to generate more detailed and accurate analysis. To me that's really exciting, as we are well placed here; we've always been able to do that in Converge with our fully coupled meshing and chemistry solver, combined with the Adaptive Mesh Refinement (AMR) function.
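
The AMR idea mentioned here can be illustrated with a toy one-dimensional example: cells are subdivided wherever a field gradient exceeds a threshold, concentrating resolution where the solution changes fastest. This is a minimal sketch of the general technique, not CONVERGE's implementation; the mesh, field values and threshold are all invented.

```python
# Toy 1-D adaptive mesh refinement: split cells where the temperature
# gradient to the next cell exceeds a threshold. Illustrative only.

def refine(cells, field, grad_threshold):
    """Split any cell whose gradient to its right neighbour exceeds the threshold."""
    new_cells, new_field = [], []
    for i, (x0, x1) in enumerate(cells):
        grad = 0.0
        if i + 1 < len(cells):
            # distance between the two cell centres
            dx = 0.5 * (x1 - x0) + 0.5 * (cells[i + 1][1] - cells[i + 1][0])
            grad = abs(field[i + 1] - field[i]) / dx
        if grad > grad_threshold:
            mid = 0.5 * (x0 + x1)
            new_cells += [(x0, mid), (mid, x1)]
            new_field += [field[i], field[i]]  # value carried to both children
        else:
            new_cells.append((x0, x1))
            new_field.append(field[i])
    return new_cells, new_field

# Uniform 10-cell mesh with a sharp jump (e.g. a flame front) near x = 0.5
cells = [(i / 10, (i + 1) / 10) for i in range(10)]
field = [300.0 if x0 < 0.5 else 2000.0 for x0, _ in cells]
cells, field = refine(cells, field, grad_threshold=5000.0)
print(len(cells))  # the mesh has grown only around the steep gradient
```

In a real solver this check runs on sub-grid velocity, temperature or species gradients every time step, and cells are coarsened again once the gradient passes; the sketch captures only the refine step.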

What trends do you foresee in the way that automotive engineers will use in-cylinder CFD?

Being able to do things faster is going to be a major trend in this field. Recently we've been working with Intel to optimise our solution to perform on its new Knights Landing platform and Xeon Phi processor. That's going to significantly reduce the time it takes to solve complex equations. As a result, the speed-to-accuracy balance is going to improve even further.

As the hardware improves I think the industry will move away from model-based solutions, which are effectively approximations, and start to solve the chemistry directly. Models tend to be used at present where time is a constraint or where there simply isn't the processing power to run a full detailed chemistry analysis.

Soot production is a good example. It's such a complex phenomenon that in order to solve it with all the intricacies needed would require a massive supercomputer. That's why you have research institutions like RWTH Aachen producing mathematical models to shortcut the process. Over time, though, we'll see the number of phenomena that can be solved directly increase dramatically.

Conjugate heat transfer is also becoming more and more important as engineers attempt to capture the material characteristics in greater detail. We think we've come up with a good solution in the form of supercycling. This uses time-averaged temperature and heat transfer coefficient data from the fluid side to perform steady-state heat transfer calculations in the solid – effectively allowing the simulation of the solid materials and the fluids to run on two different timescales concurrently.
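
The two-timescale idea behind supercycling can be sketched as follows: the fluid is advanced with a small time step while wall heat transfer data are accumulated, and every N fluid steps a steady-state solid temperature is computed from the time averages. This is a hedged toy with a lumped (single-node) solid and a stand-in periodic fluid loop; all numbers are invented and it is not CONVERGE's solver.

```python
# Toy supercycling for conjugate heat transfer: time-average the gas-side
# heat transfer data over many fluid steps, then do a steady-state solid
# solve. Lumped solid, invented numbers - illustrative only.

def supercycle(n_fluid_steps, cycle_length, t_solid_init):
    t_solid = t_solid_init
    h_sum = t_gas_sum = 0.0
    count = 0
    history = []
    for step in range(n_fluid_steps):
        # stand-in fluid step: gas temperature and wall h vary over a "cycle"
        t_gas = 800.0 + 700.0 * (step % 20) / 20.0
        h = 100.0 + 50.0 * (step % 20) / 20.0
        h_sum += h
        t_gas_sum += t_gas
        count += 1
        if (step + 1) % cycle_length == 0:
            # steady-state solid solve using the time-averaged h and T:
            # a lumped wall with convection on both sides settles where the
            # gas-side and coolant-side fluxes balance.
            h_avg = h_sum / count
            t_gas_avg = t_gas_sum / count
            h_cool, t_cool = 500.0, 360.0  # coolant side (assumed values)
            t_solid = (h_avg * t_gas_avg + h_cool * t_cool) / (h_avg + h_cool)
            history.append(t_solid)
            h_sum = t_gas_sum = 0.0
            count = 0
    return t_solid, history

t_wall, _ = supercycle(n_fluid_steps=200, cycle_length=40, t_solid_init=400.0)
print(round(t_wall, 1))
```

The payoff is that the solid, whose thermal response is far slower than the in-cylinder flow, never has to be marched at the fluid's tiny time step.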

Manufacturers are starting to contemplate more radical combustion concepts. Will these pose a problem for CFD modelling?

We're certainly starting to see a lot more novel ideas, like dual fuel combustion, opposed piston engines and two-stroke diesels. The industry as a whole is becoming more innovative, which just highlights the importance of getting the fundamentals right with CFD. You need a good simulation that's grid-convergent and isn't overly reliant on empirical models, otherwise there's a danger that you'll struggle to generate meaningful results once you introduce a major change.

By employing methods like AMR and direct chemistry in Converge we've been able to achieve very consistent results between different concepts. That means it's very robust to changes in geometry or chemical composition. Organisations such as Universitat Politècnica de València have found it to be the most effective tool for very advanced concepts, including the partially premixed combustion engine they have been working on.

With the rise in complexity of the modelling, can suppliers keep up with OEMs?      

It's a worry for sure for the smaller suppliers. The OEMs are looking at larger, more complex models and this is putting pressure on the tier ones and tier twos to ensure they have the capability. The great thing about modern networks is that you don't need that computing power in-house. We've seen a dramatic rise in the use of cloud computing over the last few years and I think that's something that's going to continue as we move forward.

The use of HPC for things like parametric design optimisation is going to become more prevalent, and I think that will push the industry towards a more elastic hardware model. In most cases you don't need a supercomputer; what you do need is the ability to harness a lot of cores for a short period of time.

Most CFD providers have some kind of on-demand option to make that happen. That allows the smaller suppliers to keep pace without necessarily having to invest in a large-scale HPC system. In our case, we've partnered with the likes of Rescale, R Systems and Penguin Computing - to name a few - to give our clients access to cloud-based solutions.

It's a really exciting time. That combination of smarter software, increased hardware capability and more efficient scalability across large core counts is allowing engineers to explore options that simply wouldn't have been possible a few years ago.