Simulation in Minutes!
GURU gen 1
January 25, 2019
In 1977, the NASA report Space Settlements: A Design Study published a grand vision of the future where, among many accomplishments, there would be space stations orbiting Earth devoted to beaming down plentiful concentrated solar energy. The report stated: “power stations are placed in orbit around the Earth to which they deliver copious and valuable electrical energy. The economic value of these power stations will go far to justify the existence of the colony and the construction of more colonies.” Thirty-five years ago, the famous writer and futurist Isaac Asimov wrote predictions for 2019 in the Toronto Star. One of them echoed the 1977 NASA report, saying a “major part of Earth’s energy will come from the sun…energy will be so necessary to all and so clearly deliverable only if the nations remain at peace and work together, that war would become simply unthinkable.” These authors were joined by many others over the years in predicting a bright future. While there have been many advances since the seventies and eighties, and solar power has become a practical and important source of energy for the world, we are nowhere near having achieved these goals. Instead, today we are still using the fossil fuels that so many predicted should have become obsolete by now. In fact, we’re still subsidizing fossil fuels, and by 2050, the United States will have underwritten the drilling of an extra 17 billion barrels of oil, enough to emit over 6 billion tons of carbon dioxide. So, were these authors three and four decades ago delusional? What went wrong?
We stopped using pay phones and video stores because the alternatives were so much better, and the same will happen to fossil fuel energy. The only question is how quickly we can get there. Engineers design everything from bridges, to airplanes, to the internet, and every one of the advances they achieve makes our lives better and grows the economy. Whether you are most concerned about breathing clean air, about the future of the climate and sea level rise, or primarily about economic growth, having plentiful renewable energy would benefit us all. We all collectively stand to gain if engineers increase the pace at which they can achieve new technological feats.
Among the vast accomplishments of Leonardo da Vinci were the following inventions credited with passing into general practical use: the strut bridge, the automated bobbin winder, the rolling mill, the machine for testing the tensile strength of wire, and the lens-grinding machine. Nikola Tesla, famous for developing alternating current, also achieved advances in wireless communication, lasers, x-rays, radar, lighting, and robotics. These are two of the most prolific multidisciplinary inventors, and while many think of them as aberrations, consider the question: what would it take to enable a large percentage of engineers to dramatically increase their abilities to invent?
Supercomputers are computers that demonstrate a high level of performance compared to general-purpose computers, and that performance (ranked by TOP500) has been increasing for decades. A common performance measure is floating-point operations per second (FLOPS): in very basic terms, the number of mathematical operations useful in science and engineering that can be performed in a second. Two decades ago the world’s top supercomputers, like the Hitachi SR2201, performed at hundreds of billions of FLOPS (hundreds of GigaFLOPS), and today iPhones report this kind of performance. Twenty years ago only the rare researcher had access to this computing power, and most people would have told you they had no need for a supercomputer of their own; today, hundreds of millions of users worldwide demonstrate they have all sorts of use cases. Modern supercomputers offer performance in the PetaFLOPS range, two factors of a thousand beyond GigaFLOPS, a millionfold increase in computing power. The Oak Ridge Leadership Computing Facility (OLCF) is famous for being a world leader in advancing supercomputing, and it’s a breeding ground for key scientific research and technological advancement. In 2012 OLCF’s Titan supercomputer came online with a peak performance of 27 PFLOPS. While OLCF continues charting a course toward Exascale computing with its new Summit system, Petascale High Performance Computing (HPC) is now available to consumers, who can rent time on cloud systems offered by companies like Google Cloud, Amazon Web Services, IBM Cloud, and Microsoft Azure. Newer providers like Rescale even offer solutions that give engineers access to simulation software on these cloud platforms.
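The scale of that jump, from GigaFLOPS to PetaFLOPS, is easier to feel with a quick back-of-the-envelope calculation. The sketch below uses the performance figures mentioned above; the workload size is a hypothetical example, not a real benchmark.

```python
# Back-of-the-envelope comparison of the FLOPS figures discussed above.
GIGA = 1e9
PETA = 1e15

hitachi_sr2201 = 600 * GIGA   # hundreds-of-GigaFLOPS class (late 1990s)
titan_peak = 27 * PETA        # Titan's peak performance, 27 PFLOPS

# A hypothetical simulation requiring 10^18 floating-point operations:
work = 1e18

def runtime_seconds(flops):
    """Time to finish `work` operations at a sustained rate of `flops`."""
    return work / flops

print(f"~{runtime_seconds(hitachi_sr2201) / 86400:.0f} days on a hundreds-of-GFLOPS machine")
print(f"~{runtime_seconds(titan_peak):.0f} seconds at Titan's 27 PFLOPS peak")
```

At these rates, the same job that would occupy a 1990s-era machine for weeks finishes in well under a minute at Petascale, which is why per-hour cloud rental of such capacity changes who can afford to simulate.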
A small team of engineers will work on concept and system development, but many specialty areas must be covered to successfully complete a project and bring a new invention to market: structural design and analysis, thermal, fluid, vibration, electrical, and many others. Humankind has developed the knowledge to do all these things, but small teams can’t afford to take years getting trained in a hundred or a thousand skills. Companies with large budgets can afford the best tools and can hire the experts. This is already a 10X advantage. Those companies will also deploy the tools on HPC for another 10X advantage, so there can be 100X differences in engineering productivity between large and small companies. The United States is a world leader in supercomputing, or High Performance Computing, and yet almost nobody uses it. Only 8% of the small- and medium-sized enterprises (SMEs) that design and manufacture new products have been using HPC, and the reason is that it’s too complicated!
Engineering software companies have designed more than a thousand different tools for specialist engineers, and when an engineer can leverage these tools it gives them new design, analysis, manufacturing, and efficiency options that can radically improve a company’s competitiveness. When we’ve interviewed vendors of such Computer Aided Engineering (CAE) tools, they tell us they might have 200 or maybe 1,000 customers worldwide. Hundreds of thousands of companies should be using the tools, and when asked why they don’t have 100 times more customers, the consistent answer is that 1) the tools are too hard to use, and 2) they’re too expensive. We can’t afford to lose time because the best tools are too complicated to use. The expertise barrier in engineering has been identified by many organizations lately. The Department of Energy raised the concern with CAE, saying “Computational Science has become the third pillar of science… Despite the great potential …HPC has been underutilized …hurdles remain for wider adoption especially for small and medium sized manufacturing and engineering firms.” NASA set goals for the year 2030 and said the solution to this problem must “minimize user intervention…single engineer/scientist must be able to conceive, create, analyze, and interpret a large ensemble of related simulations in a time-critical period (e.g., 24 hours)…full automation is essential.”
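The “large ensemble of related simulations” NASA describes is, at its core, an automation problem: launch every case without per-case user intervention, then interpret the results. A toy sketch of that pattern, where the simulation function is a trivial stand-in for a real solver and all names are hypothetical:

```python
from concurrent.futures import ThreadPoolExecutor

def simulate(angle_of_attack: float) -> dict:
    # Stand-in for a real solver run; a production workflow would
    # submit an HPC job here instead of evaluating a toy formula.
    lift = 0.1 * angle_of_attack  # illustrative linear model only
    return {"aoa": angle_of_attack, "lift": lift}

def run_ensemble(cases):
    # Every case is launched automatically; the engineer supplies only
    # the parameter sweep and reads back the ensemble of results.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(simulate, cases))

results = run_ensemble([0.0, 2.0, 4.0, 6.0])
best = max(results, key=lambda r: r["lift"])
print(f"best case: aoa={best['aoa']}, lift={best['lift']:.2f}")
```

The point of the sketch is the shape of the workflow, not the physics: once case setup, submission, and post-processing are scripted end to end, a single engineer can run and interpret the whole ensemble within the kind of 24-hour window NASA targets.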
And it’s not just CAE that needs to be made much more accessible: the industrial sector consumes a third of the energy in the United States, and yet it has done about the least to implement energy efficiency measures. The latest IPCC report makes an obvious call for energy conservation. We are far from achieving 100% renewable energy, and demand continues to increase, so maximizing efficiency is a must. Today, performing an energy audit of a manufacturing plant means hiring outside contractors. No one employed at the plant is tasked with energy efficiency work; they don’t have the time or the expertise to do it, and their main focus is getting products made on time, on budget, and at high quality. The Department of Energy recently identified the importance of solving the expertise barrier to energy efficiency, finding that three quarters of employers reported difficulty hiring qualified energy efficiency workers. The California Energy Commission recently confirmed these concerns. Dramatic levels of greenhouse gas emissions could be avoided if this expertise barrier were solved, and the energy efficiency business is projected to be a trillion-dollar endeavor.
Today, cloud High Performance Computing (HPC) has become powerful and affordable enough, and Artificial Intelligence (AI) has become mature enough, that we can close the gulf that exists in many companies’ ability to use the best engineering tools and run them on cloud HPC. There is about to be an explosion of AI assistants across multiple disciplines, and companies adopting AI will gain massive advantages in productivity.
That’s why MSBAI started developing GURU: The Ultimate Engineering AI Assistant. It’s like J.A.R.V.I.S. in the Iron Man movies. GURU is solving widespread expertise-barrier problems in engineering by delivering specialized capabilities as a service. We leverage modern High Performance Computing (HPC) and Artificial Intelligence (AI) to do it.
GURU runs a client-server connection from the user’s device to services, HPC, and the ‘Agent Society’, a set of libraries of agents that each perform an individual task. An end-to-end engineering workflow is accomplished by running a series of agents in concert.
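The pattern described above, a workflow assembled from single-task agents that hand work to one another, can be sketched minimally as follows. All names here are hypothetical illustrations; GURU’s actual implementation and API are not public.

```python
from typing import Callable, Dict, List

# A hypothetical 'Agent Society' sketch: each agent performs one task and
# passes a shared context to the next. Names are illustrative only.
Context = Dict[str, object]
Agent = Callable[[Context], Context]

def mesh_agent(ctx: Context) -> Context:
    ctx["mesh"] = f"mesh for {ctx['geometry']}"
    return ctx

def solver_agent(ctx: Context) -> Context:
    ctx["result"] = f"solved {ctx['mesh']} on {ctx['hpc_target']}"
    return ctx

def report_agent(ctx: Context) -> Context:
    ctx["report"] = f"report: {ctx['result']}"
    return ctx

def run_workflow(agents: List[Agent], ctx: Context) -> Context:
    # An end-to-end workflow is just agents applied in sequence,
    # each enriching the shared context.
    for agent in agents:
        ctx = agent(ctx)
    return ctx

final = run_workflow([mesh_agent, solver_agent, report_agent],
                     {"geometry": "wing.step", "hpc_target": "cloud-hpc"})
print(final["report"])
```

The design choice this illustrates is composability: because each agent only reads from and writes to the shared context, new specialty capabilities can be added to a workflow by inserting another agent into the chain rather than rewriting the pipeline.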