Scroll down to explore projects with tangible outcomes.
The project studied a method for learning probability densities as normalizing flows using only preference data. Some applications:
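As a rough, self-contained illustration of the idea (not the project's actual model or code), the Python sketch below fits a small affine-coupling flow so that its log-density acts as the utility in a Bradley-Terry model of pairwise preferences, i.e. the flow is trained only from "x was preferred over y" data. All class and function names, and the synthetic preference data, are invented for the example.

```python
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """Affine coupling layer: rescales and shifts the second half of x
    conditioned on the first half; returns z and this layer's log|det J|."""
    def __init__(self, dim, hidden=64, flip=False):
        super().__init__()
        self.d1, self.flip = dim // 2, flip
        self.net = nn.Sequential(
            nn.Linear(self.d1, hidden), nn.Tanh(),
            nn.Linear(hidden, 2 * (dim - self.d1)),
        )

    def forward(self, x):
        if self.flip:                            # alternate which half is transformed
            x = torch.flip(x, dims=[-1])
        x1, x2 = x[..., :self.d1], x[..., self.d1:]
        s, t = self.net(x1).chunk(2, dim=-1)
        s = torch.tanh(s)                        # bounded scales for stability
        z = torch.cat([x1, x2 * torch.exp(s) + t], dim=-1)
        if self.flip:
            z = torch.flip(z, dims=[-1])
        return z, s.sum(dim=-1)

class Flow(nn.Module):
    """Stack of coupling layers over a standard-normal base distribution."""
    def __init__(self, dim, n_layers=4):
        super().__init__()
        self.layers = nn.ModuleList(
            [AffineCoupling(dim, flip=(i % 2 == 1)) for i in range(n_layers)]
        )
        self.base = torch.distributions.MultivariateNormal(
            torch.zeros(dim), torch.eye(dim)
        )

    def log_prob(self, x):
        z, log_det = x, x.new_zeros(x.shape[0])
        for layer in self.layers:
            z, ld = layer(z)
            log_det = log_det + ld
        return self.base.log_prob(z) + log_det

def preference_loss(flow, winners, losers):
    """Bradley-Terry-style likelihood with the flow's log-density as utility:
    P(winner preferred over loser) = sigmoid(log p(winner) - log p(loser))."""
    margin = flow.log_prob(winners) - flow.log_prob(losers)
    return nn.functional.softplus(-margin).mean()   # = -mean log sigmoid(margin)

torch.manual_seed(0)
dim = 2
# Synthetic preferences: of two random points, the one closer to (1, 1) "wins".
target = torch.tensor([1.0, 1.0])
a, b = torch.randn(512, dim), torch.randn(512, dim)
a_wins = ((a - target).norm(dim=1) < (b - target).norm(dim=1)).unsqueeze(1)
winners, losers = torch.where(a_wins, a, b), torch.where(a_wins, b, a)

flow = Flow(dim)
opt = torch.optim.Adam(flow.parameters(), lr=1e-3)
for _ in range(500):
    opt.zero_grad()
    preference_loss(flow, winners, losers).backward()
    opt.step()
# After training, flow.log_prob(...) assigns higher density near (1, 1).
```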
The project studied a new type of Bayesian optimization for learning user preferences or knowledge in high-dimensional spaces. The user provides input by selecting the preferred option with a slider. The method can find the minimum of a high-dimensional black-box function, a task that is often infeasible for existing preferential Bayesian optimization frameworks based on pairwise comparisons. Some applications:
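A toy illustration of the slider-style query only (not the project's GP-based algorithm): at each round the "user" is shown a one-dimensional slice of the search space and moves a slider to the point on that slice they prefer most; here the user is simulated by minimizing a hidden quadratic along the slice. Function names and the test function are invented for the example.

```python
import numpy as np

def hidden_objective(x):
    """Black-box function the user implicitly knows (20-D quadratic bowl)."""
    return np.sum((x - 0.3) ** 2)

def slider_answer(x_ref, direction, lo=-1.0, hi=1.0, n_grid=201):
    """Simulated user: returns the preferred point along x_ref + t * direction,
    with t the slider position in [lo, hi]."""
    ts = np.linspace(lo, hi, n_grid)
    candidates = x_ref[None, :] + ts[:, None] * direction[None, :]
    values = np.array([hidden_objective(c) for c in candidates])
    return candidates[np.argmin(values)]

def projective_query_loop(dim=20, n_queries=60, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1, 1, size=dim)           # current best guess
    for _ in range(n_queries):
        direction = rng.standard_normal(dim)
        direction /= np.linalg.norm(direction)
        x = slider_answer(x, direction)          # user moves the slider
    return x

x_star = projective_query_loop()
print("estimated minimum value:", hidden_objective(x_star))
```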
Multi-fidelity Bayesian optimization (MFBO) integrates cheaper, lower-fidelity approximations of the objective function into the optimization process. The project introduces rMFBO (robust MFBO), a methodology to make any MFBO algorithm robust to the addition of unreliable information sources. Some applications:
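A toy sketch of the multi-fidelity idea itself (not rMFBO, and not Bayesian optimization proper): cheap low-fidelity evaluations screen batches of candidates, and the limited high-fidelity budget is spent only on the most promising ones. The test functions, costs, and budget are made up for illustration; a real MFBO method would replace random screening with a surrogate model and acquisition function, and rMFBO additionally guards against the low-fidelity source being unreliable.

```python
import numpy as np

def high_fidelity(x):
    """Expensive objective (cost 10 per evaluation in this toy example)."""
    return np.sum((x - 0.5) ** 2)

def low_fidelity(x, rng):
    """Cheap but biased and noisy approximation (cost 1 per evaluation)."""
    return high_fidelity(x) + 0.3 * np.sin(5 * x[0]) + 0.1 * rng.standard_normal()

def multi_fidelity_search(dim=5, budget=100, hf_cost=10, lf_cost=1,
                          n_screen=20, seed=0):
    rng = np.random.default_rng(seed)
    spent, best_x, best_y = 0, None, np.inf
    while spent + lf_cost * n_screen + hf_cost <= budget:
        # Screen a batch of random candidates with the cheap approximation...
        candidates = rng.uniform(0.0, 1.0, size=(n_screen, dim))
        scores = [low_fidelity(c, rng) for c in candidates]
        spent += lf_cost * n_screen
        # ...and pay for a high-fidelity evaluation only on the best-looking one.
        x = candidates[int(np.argmin(scores))]
        y = high_fidelity(x)
        spent += hf_cost
        if y < best_y:
            best_x, best_y = x, y
    return best_x, best_y

best_x, best_y = multi_fidelity_search()
print("best high-fidelity value found:", best_y)
```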