Autodesk University, a ReCap
As a year-end wrap-up, I’d like to share some thoughts from the events I attended at this year’s Autodesk University. During the opening keynote (go ahead and watch, then come back), Autodesk CTO Jeff Kowalski described going “Outside” as important for the success of design firms. We need to alter our mindsets to embrace change. This may mean working with new team members from external firms, working with new disciplines in other industries, and embracing technology and tools that were not necessarily designed specifically for our work. We need to re-imagine our work, our business structures and our lives. I really like the quote he used to make his point:
The illiterate of the 21st century will not be those who cannot read and write, but those who cannot learn, unlearn and relearn.
At NBBJ, we call this Change Design. This philosophy, deeply embedded in our practice, enables us to create and innovate in ways not possible when working only with what you know. Working collaboratively with non-traditional consultants creates that a-ha moment more consistently on our projects. As design practices regain strength after the last five challenging years, this idea of “outside” could not come at a more perfect time to go mainstream.
Along a similar thread, I decided to go outside my comfort zone when planning my conference. This was a very different AU for me, as it was the first time I didn’t stack up my class schedule with only those offered for my primary tool of choice. In 2002, that was Architectural Desktop, and in 2006 it switched completely to Revit. Seven years later, I am much more interested in computational design and design technology leadership. Key to these interests, which thankfully align with my firm’s vision, were round-table discussions on leadership, fabrication classes, further exploration of computational design tools like Dynamo and conceptual design tools such as Fusion 360, and the mighty Design Computation Symposium.
Design Computation Symposium
This is the fourth instance of this conference-within-a-conference. The format this year was a half-day event emceed by Matt Jezyk of Autodesk. There were too many presentations to cover them all in this article, so I’d like to summarize more of what I saw in future writings.
On day one of AU, it became clear that Autodesk is committed to a computational workflow. Actually, on day zero, the day before, there was a special day-long Dynamo workshop (which I missed) that looked at the node-to-code possibilities of embedding DesignScript within Dynamo. At the kickoff of the conference, the gravity of the situation hit when Carl Bass talked about Dynamo in a big way as a punctuation mark to the AU keynote, in front of all 9,200 conference attendees and 37,000 virtual attendees. He even kicked off the symposium personally for the gathering of approximately 150 attendees. The Design Computation Symposium presentations ranged from case studies by engineers, architects and fabricators to an inspiring closing keynote by Enric Ruiz-Geli of Cloud9 on the subject of “particles”.
A key takeaway from Enric’s talk was that sustainable design should be embedded in the project, finding innovative ways to reduce the costs of structure and assembly to cover the first costs of solutions such as the Media-TIC building in Barcelona. Its active envelope, filled with nitrogen clouds, blocks glare and UV light, significantly reducing cooling costs and making the building a more comfortable and dynamic space to be in.
Top 7 Digital Practice Trends
What direction is the industry moving in that you need to pay more attention to? What will drastically change the way you design and deliver projects in the coming decade? While at Autodesk University, I attended the usual keynotes, classes and ad-hoc sessions. I began spotting patterns in the innovative ways people are working today, and based on the glimpses into our future that Autodesk and the vendors in the exhibit hall allowed, I’d like to share some observations.
Not all of these concepts are available or fully implementable today, while others are definitely ready for immediate use. I’d like to explore each of them in detail in future posts. Here’s what I’m keeping an eye on for the future of the Architecture, Engineering and Construction (AEC) industry.
BIM and Computational Design will rapidly converge into a single process. These separate silos really only exist because of the tools of choice typically used by project teams. Design computation is becoming necessary to realize and rationalize complexity in our designs, regardless of a project team’s formal aspirations. You don’t need doubly-curved surfaces to utilize computational design concepts. Since data and parametric behaviors are shared concepts in both approaches, it only makes sense that the dead-end hand-off between the silos and tools dissolves.
BIM and Computational Design have traditionally been different tools requiring different mindsets. When these two mortal enemies come together, everybody wins. The transition from design to documentation begins to blur, allowing decisions to be delayed and enabling deeper design exploration, increased accuracy and reduced costs.
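To make the idea concrete that computational design is about data-driven rules rather than exotic geometry, here is a minimal, self-contained Python sketch. It is not tied to Revit, Dynamo or any Autodesk API; the rule and the names are invented purely for illustration. It encodes the kind of simple parametric logic a visual-programming graph might hold: deriving shading-fin depth for each façade from its orientation.

```python
import math

def fin_depth(azimuth_deg, base_depth=0.1, max_extra=0.5):
    """Toy rule: deepen shading fins as a facade faces closer to due south.

    azimuth_deg: facade orientation in degrees (0 = north, 180 = south).
    Returns fin depth in meters.
    """
    # Weight peaks at 1.0 for a south-facing facade and falls to 0.0 at north.
    south_weight = (1 - math.cos(math.radians(azimuth_deg))) / 2
    return round(base_depth + max_extra * south_weight, 3)

# One rule, evaluated across every facade orientation at once --
# change the rule and the whole design updates.
facades = {"north": 0, "east": 90, "south": 180, "west": 270}
depths = {name: fin_depth(az) for name, az in facades.items()}
```

The point is not the specific formula but the workflow: because the design intent lives in a rule plus data, the same logic could drive component parameters in a BIM model instead of stopping at a geometry hand-off.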
Grasshopper, while it inspired the visual-programming paradigm employed by Dynamo, is at a disadvantage: Revit is a parametric design tool that understands what building components are, and Rhino certainly does not. While some find this troubling, even stifling to creativity, I think Revit with Dynamo is the horse to bet on in the long run, and it’s the only tool in the BIM ecosystem that will span concept design, analysis, visualization, documentation and fabrication. Rich ecosystems and faster regeneration performance will come, and then the holdouts will come over to the dark side.
With projects like Dynamo and the IFC translator trending toward open source with direct support from a company like Autodesk, this will be an interesting space to watch. Since anyone who wishes to (and possesses the skills) can contribute code, you can mold the future to meet your own needs. Dynamo especially is receiving a great deal of attention in the industry as a way of extending the functionality of parametric design tools like Revit and (more recently) Inventor.
Scan to BIM, photogrammetry, LiDAR, augmented reality… these are terms you should begin to hear a lot more about in the future. This technology is now so refined that you may never have to create record models showing an ‘As-Built’ or ‘As-Constructed’ condition again. Instead, you can show the ‘As-Exists’ condition at this very moment using reality capture and incorporating it into BIM tools. By democratizing reality capture, whether with a tool as simple as an iPhone or by collecting data from multiple sources in Autodesk ReCap Photo, you will have the ability to see this information right in the context of Revit. The digital world can consume reality in ways that a tape measure and sketch pad never could. To find out more, watch the New Reality presentation by Tatjana Dzambazova.
Access to Resources
Access to robust digital design tools and infinite computing resources will continue to grow, at an ever-lower overall cost. Lowered barriers to accessing technology will benefit both large and small firms. I wrote an earlier piece on the Death of the PC, which received a great deal of discussion on LinkedIn and Twitter. There are strong feelings both for and against this coming change, which I feel is inevitable and a positive thing for designers and collaboration. Fear of change can certainly hinder adoption, whether the pushback is legal, cultural or rooted in embedded workflows. The technology preview launched in November has seen great adoption, and it has implications beyond the use of Autodesk design tools. It could also affect the future of gaming.
Design to Fabrication
With more access to CAM tools, designers are becoming fabricators. Rapid prototyping is being commoditized through technologies like desktop CNC machines and 3D printers. Meanwhile, physical mock-ups, often expensive to produce, are now easier to create digitally, easier to experience and understand with virtual-reality gear like the Oculus Rift, and easier to prototype with Autodesk’s many cloud-hosted tools like 123D Make and the newly announced beta, CAM 360.
Real-time collaboration and communication will replace asynchronous, inefficient processes. Concurrent design, analysis and visualization will be a reality in the not-too-distant future. This is especially true if the soon-to-be-released technology preview of Autodesk Showcase 360 looks as good as it does in this teaser video. Can you imagine being freed from the constraint of design and rendering being two silos of activity, often two specialized applications and two sets of hands?
Showcase, the desktop application, is currently an interactive presentation tool, and it’s doubtful the cloud version will be much different at first. Could you imagine how much more productive you would be if the design tools became as fast and interactive at showing physically accurate lighting, textures and reflections? Pixel-based shaders that use the massive power trapped in a GPU already exist inside Revit as part of the ‘Ray Trace’ visual style. It’s just a matter of time before live rendering is possible in working views, constantly updating as you design. Expect simulation and analysis to follow soon after, perhaps as quickly as the next 2-3 years. This will be a holy grail of advanced computing resources and truly allow enhanced communication with our clients as we share our design ideas.
This last one is a favorite, and came from a discussion with Enric from Cloud9 after his presentation. Design Computation and Sustainable Design as terms will fade as they become deeply embedded in our work. Analysis as a feedback loop to inform design and real-time dashboards will be expected on all projects.
One example of sustainable design feedback in the design environment is the Revit Daylighting Analysis plug-in technology preview available now. This will help you document and visually check for LEED IEQc8.1 2009 compliance.
Another excellent candidate for an embedded workflow is proper interoperability, which will enable teams to collaborate more effectively and glide between tools effortlessly, posing specific hypotheses to test against the project. Moving geometry between tools is trivial; moving data between tools is key. With the latest IFC release (version 4) pending, this looks to be closer to reality than previously thought possible.
When these three key concepts (Design Computation, Sustainable Design and Interoperability) become commoditized, the terms lose their power. Then, maybe we won’t need specialized symposiums or conferences on these topics; they will just be the table stakes of our core design practice. Then, I can retire happy.
A knight travails under the idea that he or she is striving for a world where knights are no longer needed.
Thanks to Shawn Foster of Black and Veatch for that closing thought during my final session of the conference, Design Technologists and Their Impact On The Organization. I hope this article has an impact on your own work. Let me know what you think in the comments.