An important part of the design of a new helicopter is the flight test, where data is collected to measure the forces on different components of the helicopter. Before a helicopter design is approved, stress tests are required to demonstrate that the design can withstand these forces over the expected lifetime of the helicopter. A physical stress test is costly and time consuming, and it can only investigate about a half-dozen forces. It is therefore essential to choose the right forces from the flight test to apply in the physical stress tests: the most extreme forces in directions that cause wear and tear. Choosing these forces is difficult, since millions of data points are measured during the flight test and the data points lie in a high-dimensional space.
Traditionally, Bell Helicopter uses expert knowledge to determine which data points are likely to be important. The CRM team proposed building a blind data reduction method with the aim of matching the selection achieved with expert knowledge. A research team consisting of a professor, postdoctoral fellows, and undergraduate and graduate students worked with Bell Helicopter, using ideas from convex analysis and computational mathematics to find the required extreme points. The blind algorithm was a success, with its selections agreeing with those of the experts.
These results led to successful physical stress testing on the new model of helicopter.
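The convex-analysis idea behind such a search can be sketched in a few lines. The following is a minimal illustration, not the team's actual algorithm: it uses the fact that a linear functional over a point cloud is maximized at an extreme point of the cloud's convex hull, so sampling many random directions harvests a candidate set of extreme load cases. All data here are synthetic.

```python
import numpy as np

def candidate_extreme_points(points, n_directions=2_000, seed=0):
    """Harvest extreme points of a point cloud: each random linear
    functional is maximized at an extreme point of the convex hull."""
    rng = np.random.default_rng(seed)
    dirs = rng.standard_normal((n_directions, points.shape[1]))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    winners = np.empty(n_directions, dtype=np.int64)
    for start in range(0, n_directions, 200):  # chunk to bound memory
        block = dirs[start:start + 200]
        winners[start:start + 200] = np.argmax(points @ block.T, axis=0)
    return np.unique(winners)

# Toy flight-test cloud: 50,000 samples of a 6-component load vector.
loads = np.random.default_rng(1).standard_normal((50_000, 6))
extremes = candidate_extreme_points(loads)
print(f"{extremes.size} candidate extreme load cases")
```

In practice the directions would be weighted toward load combinations known to cause wear, rather than sampled uniformly.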
A province-wide electric energy grid requires the efficient transfer of electrical generation capacity to manage the many and varied demands from electrical users across the system. Researchers Jean-Claude Rizzi and Guy Vanier at Hydro-Québec/TransÉnergie are designing methods for optimizing dynamic transfer limits in high voltage electrical networks.
In collaboration with Dr. Michel Gendreau and students at Montreal’s Industrial Problem Solving Workshop, the team built an abstract model of the problem and developed optimization methods for solving it. By the end of the workshop, a heuristic algorithm was proposed and later implemented by two students who ultimately delivered a prototype of the software.
This prototype grew into the basic tool for the engineers designing network operation strategies at TransÉnergie. The Hydro-Québec researchers report that the experience enabled them to make rapid progress in their work through stimulating exchanges with academic researchers, in a relaxed atmosphere. The IPSW experience was for them as productive as it was pleasant.
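To give a flavour of how a transfer-limit question can be posed as an optimization problem, the sketch below maximizes the power transferred across a corridor subject to line thermal limits, using a DC power-flow approximation with hypothetical distribution factors. This is a generic textbook formulation, not the TransÉnergie prototype.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical three-line corridor under a DC power-flow approximation.
ptdf = np.array([0.6, 0.3, 0.1])        # MW of line flow per MW transferred
limit = np.array([450.0, 200.0, 90.0])  # thermal limits, MW
base = np.array([120.0, 40.0, 10.0])    # pre-existing flows, MW

# Maximize the transfer t subject to base + ptdf * t <= limit.
res = linprog(c=[-1.0], A_ub=ptdf.reshape(-1, 1), b_ub=limit - base,
              bounds=[(0.0, None)])
print(f"maximum secure transfer: {res.x[0]:.1f} MW")  # limited by line 2
```

The dynamic version of the problem, where limits must be recomputed as network conditions change, is what makes heuristics attractive in practice.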
The heart of a quantum computer is a basic logic gate with several inputs and outputs, controlling several basic quanta of information known as qubits. The research group at the Institute for Quantum Science and Technology (IQST), led by Dr. Barry Sanders at the University of Calgary, has developed a novel machine learning method for creating the optimal design of a three-qubit gate. This method, known as Subspace-Selective Self-Adaptive Differential Evolution, proved to be computationally intractable for larger numbers of qubits.
IQST presented this design problem at the 2015 PIMS Industrial Problem Solving Workshop held at the University of Saskatchewan. A four-qubit design leads to a hard, high-dimensional optimization problem. A mathematical reformulation of the design problem led to an algorithmically simpler feasibility problem. The result is a computationally efficient algorithm that solves the design problem in a matter of hours of compute time, rather than weeks or months.
The result was a functional design of a four qubit gate, which had never been achieved before.
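To convey the flavour of the approach, the sketch below uses generic differential evolution (via SciPy, not SuSSADE itself) to tune a toy two-qubit control sequence so that it reproduces a target gate up to a global phase. The parameterization and target are invented for illustration.

```python
import numpy as np
from scipy.linalg import expm
from scipy.optimize import differential_evolution

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.diag([1.0, -1.0]).astype(complex)

def u(theta):
    """Toy two-qubit control sequence: X rotations, a ZZ coupling
    pulse, then Z rotations, each scaled by a control parameter."""
    h1 = theta[0] * np.kron(X, I2) + theta[1] * np.kron(I2, X)
    h2 = theta[2] * np.kron(Z, Z)
    h3 = theta[3] * np.kron(Z, I2) + theta[4] * np.kron(I2, Z)
    return expm(-1j * h3) @ expm(-1j * h2) @ expm(-1j * h1)

target = u(np.array([0.3, 1.1, 0.7, 0.2, 0.9]))  # gate we want to hit

def infidelity(theta):
    # 1 - |Tr(V^dag U)| / 2^n: zero when U matches the target
    # up to a global phase (here n = 2 qubits, so divide by 4).
    return 1.0 - abs(np.trace(target.conj().T @ u(theta))) / 4.0

result = differential_evolution(infidelity, bounds=[(-np.pi, np.pi)] * 5,
                                seed=0, tol=1e-8, maxiter=300)
print(f"best infidelity found: {result.fun:.2e}")
```

The cost of such searches grows quickly with the number of qubits and control parameters, which is why the workshop's reformulation into a feasibility problem mattered.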
Distributed acoustic sensing (DAS) devices are built from long fibre optic cables that are interrogated using a laser and detector at one end, to sense micro-vibrations at any point along the fibre. Such a cabling system can be a cost-effective method to collect data across many kilometres of commercial infrastructure.
In current oil recovery technologies, fluid flow and fracturing processes can be monitored with these fibre optic cables installed in a deep borehole in the earth. The 2015 PIMS Industrial Problem Solving Workshop in Saskatchewan supported a project to determine how single-component sensor data can be used to provide information about fracture hypocentres. Travel times, Eikonal equations, and least-squares modelling of the data are all used to improve the resolution of the sensing equipment.
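A minimal version of the travel-time fitting can be sketched as follows. In a homogeneous medium the Eikonal travel time reduces to straight-ray distance divided by wave speed, so a hypocentre can be estimated from arrival times along the fibre by nonlinear least squares. The geometry, wave speed, and noise level below are all hypothetical.

```python
import numpy as np
from scipy.optimize import least_squares

v = 3000.0  # assumed P-wave speed, m/s (homogeneous medium)
z_channels = np.linspace(0, 2000, 40)  # DAS channels down the borehole
sensors = np.column_stack([np.zeros_like(z_channels), z_channels])

# Synthetic event at (x, z) = (150, 1200) m with origin time 0.05 s.
true_src, true_t0 = np.array([150.0, 1200.0]), 0.05
arrivals = true_t0 + np.linalg.norm(sensors - true_src, axis=1) / v
arrivals += np.random.default_rng(0).normal(0, 1e-4, arrivals.size)  # pick noise

def residuals(p):
    # p = (x, z, t0): predicted minus observed travel times.
    pred = p[2] + np.linalg.norm(sensors - p[:2], axis=1) / v
    return pred - arrivals

fit = least_squares(residuals, x0=[50.0, 1000.0, 0.0])
print("estimated hypocentre (x, z, t0):", fit.x)
```

In a heterogeneous medium, the straight-ray distances would be replaced by travel times from a numerical Eikonal solver.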
This research collaboration has developed into an NSERC-funded internship with Fotech, implementing novel signal processing algorithms for other DAS applications in monitoring pipelines, rail lines, and other linear assets in the field.
Managing a network of mines is a complex operation, as each mine has its own unique character: a range of products at various capacities, varying amounts of space for on-site inventory storage, different processing facilities to refine products, and unique transportation access and costs.
The Potash Corporation, headquartered in the Province of Saskatchewan, has mines at locations around the world and supplies a global market for fertilizer and crop nutrients that are essential in modern agriculture.
Scheduling the shut-downs and start-ups of each potash mine, while adhering to a variety of labour and fiscal constraints, is critical in determining the network operations. Ultimately the goal of a commercial mining operation is to maximize profits over a sustained period of time, taking into consideration the constraints of operation while responding to opportunities in the marketplace. The researchers at Potash Corp presented a challenge to the participants of the 2015 PIMS Industrial Problem Solving Workshop: to come up with an effective software algorithm to aid in optimizing the operations of the mines. A key improvement was to provide feedback to the operators on how to adjust the algorithm on the fly, to provide useful and reliable results for real operations over a sustained period. The result was an interactive software tool used as a prototype for managing these distributed mine operations.
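The flavour of the scheduling problem can be conveyed with a deliberately small sketch: two hypothetical mines, four quarters, and a brute-force search over run/shut-down patterns. A real tool would use integer programming or heuristics and many more constraints (inventory, labour, transport), none of which are modelled here.

```python
import itertools
import numpy as np

# Hypothetical two-mine, four-quarter example.
capacity = np.array([300.0, 500.0])    # kilotonnes per quarter when running
run_cost = np.array([20.0, 35.0])      # fixed cost per running quarter, $M
price = np.array([0.25, 0.18, 0.20, 0.30])       # $M per kilotonne, by quarter
demand = np.array([500.0, 400.0, 450.0, 700.0])  # market absorbs this much

def profit(schedule):
    """schedule[m][t] = 1 if mine m runs in quarter t."""
    sched = np.array(schedule).reshape(2, 4)
    sold = np.minimum((sched * capacity[:, None]).sum(axis=0), demand)
    return (price * sold).sum() - (sched * run_cost[:, None]).sum()

best = max(itertools.product([0, 1], repeat=8), key=profit)
print("best run pattern (mine x quarter):\n", np.array(best).reshape(2, 4))
print(f"profit: ${profit(best):.1f}M")
```

Even this toy version shows the interplay the workshop team faced: when prices are low, it can pay to idle the costlier mine rather than run the whole network.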
[Figure: UBC Brock Commons, an 18-storey tall wood hybrid building]
Canada’s wood products industry is investing to enable the use of wood in larger and taller buildings, so that this renewable resource will play a greater role in providing environmentally friendly building solutions. As communities slowly transform to accommodate a growing population and changing demographics, these solutions are needed to selectively convert areas to larger buildings. To ensure the long-term success of this initiative, the industry is committed to ensuring consistency in the performance of Canadian lumber in structural engineering applications, guarding against a potential decline due to widespread changes in the wood fibre basket caused by the changing climate or by major regional disturbances such as forest fires or insect attacks.
Commissioned by industry, FPInnovations, working with researchers in the UBC Statistics Department and staff in the Department’s Applied Statistics and Data Science Group (ASDa), developed the sampling framework and analysis tools for a long-term lumber monitoring program. The program, which minimizes sampling costs while reducing the time to detection of any potential downward trend, is based on a novel longitudinal statistical technique for assessing trend. This program was piloted in a two-year cross-Canada study, which readily convinced the forestry industry of its merits and put Canada in the lead internationally.
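The underlying monitoring question, whether the strength of sampled lumber drifts downward over time, can be illustrated with a generic trend test. The program's actual longitudinal methodology is more sophisticated than the simple regression below, and the data here are simulated.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
years = np.arange(2012, 2022)
# Hypothetical monitoring data: 60 bending-strength specimens (MPa)
# per year, with a small simulated downward drift of 0.15 MPa/year.
samples = [rng.normal(48.0 - 0.15 * i, 6.0, size=60)
           for i, _ in enumerate(years)]

year_col = np.repeat(years, 60)
strength = np.concatenate(samples)
fit = stats.linregress(year_col, strength)
# For a one-sided test of decline, halve the two-sided p-value
# when the estimated slope is negative.
print(f"trend: {fit.slope:.3f} MPa/year, one-sided p = {fit.pvalue / 2:.4f}")
```

The design challenge the ASDa team addressed is visible even here: with noisy specimens, detecting a small drift quickly requires careful choices of sample sizes and sampling schedule.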
Recognizing that improvements might be possible, the group’s membership and range of expertise were expanded with support from FPInnovations and two successive NSERC Collaborative Research and Development Grants, leading to a commitment of over one million dollars over a period of ten years. These grants now support the work of a large team of researchers from Simon Fraser University, UBC and FPInnovations. This team has made breakthroughs in applying fundamental statistical methods in the assessment of existing and new wood products, enabling the regulatory system to accommodate more innovative products from this sustainable resource.
The Michelin Tire Company requires the tires it produces to be very uniform in order to provide a quiet, smooth ride for the automotive consumer. The more uniform the tire, the quieter the ride and the more comfortable the road experience. A modern automobile tire is constructed from twenty or more layers of materials including an air-tight inner seal, layers of rubber, cords and steel belts, bonding agents and finally a surface tread to complete the assembly. The optimal alignment of layers and bonding components is critical to the performance of the assembled tire.
At a PIMS Industrial Problem Solving Workshop (IPSW) in Calgary, Michelin presented the problem of designing a rigorous testing procedure for the construction and analysis of their tires, ensuring optimal layering of components to produce a uniform tire. A team of mathematicians and statisticians devised a successful testing protocol using advanced techniques in harmonic analysis, statistical experimental design, and Monte Carlo simulation. A second workshop in Vancouver extended the testing procedures to include the method of Good Lattice Points, accounting for non-harmonic frequency components in the tire non-uniformity and optimally reducing select frequencies that have a marked effect on consumer comfort.
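One ingredient of such an analysis can be illustrated simply. If each layer contributes a first-harmonic force variation whose phase is set by where the layer's joint is placed around the tire, the contributions add as vectors, and placements can be chosen so they largely cancel. The amplitudes below are invented, and the alternating heuristic merely stands in for the workshop's experimental-design and lattice-point methods.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical first-harmonic force amplitudes contributed by each
# of five layers (newtons).
amps = np.array([12.0, 9.0, 7.5, 5.0, 3.0])

def resultant(phases):
    # Layer harmonics add as vectors; the resultant amplitude is
    # what the consumer feels once per wheel revolution.
    return abs(np.sum(amps * np.exp(1j * phases)))

# Monte Carlo baseline: random joint placement around the tire.
random_trials = [resultant(rng.uniform(0, 2 * np.pi, amps.size))
                 for _ in range(10_000)]
print(f"random placement, mean resultant: {np.mean(random_trials):.1f} N")

# Simple optimized placement: alternate layers between opposite
# sides of the tire, largest first, so contributions cancel.
phases = np.where(np.arange(amps.size) % 2 == 0, 0.0, np.pi)
print(f"alternating placement resultant: {resultant(phases):.1f} N")
```

A full protocol must also handle non-harmonic components and measurement noise, which is where the experimental design and Good Lattice Points entered.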
The company implemented this novel testing procedure into their factories and reported savings in the hundreds of thousands of dollars per year.