SVGD is a general-purpose variational inference algorithm that forms a natural counterpart of gradient descent for optimization. SVGD iteratively transports a set of particles to match the target ...
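The iterative particle transport described above can be sketched in a few lines. This is a minimal illustration, not the authors' reference implementation: it assumes a 1-D standard-normal target and an RBF kernel with a fixed bandwidth `h`; the function names (`svgd_step`, `grad_logp`) are illustrative.

```python
import numpy as np

def svgd_step(x, grad_logp, h=1.0, eps=0.1):
    # Pairwise differences and RBF kernel matrix k(x_j, x_i)
    d = x[:, None] - x[None, :]
    k = np.exp(-d**2 / (2 * h**2))
    # Gradient of the kernel w.r.t. its first argument (repulsive term)
    gk = d / h**2 * k
    # SVGD direction: kernel-smoothed score (attraction) + repulsion
    phi = (k @ grad_logp(x) + gk.sum(axis=1)) / len(x)
    return x + eps * phi

rng = np.random.default_rng(0)
x = rng.normal(3.0, 0.5, size=50)   # particles start far from the target
grad_logp = lambda x: -x            # score function of N(0, 1)
for _ in range(500):
    x = svgd_step(x, grad_logp)
# particles now approximate samples from N(0, 1)
```

The repulsive term `gk.sum(axis=1)` is what keeps the particles spread out rather than collapsing onto the mode, which is the feature that distinguishes SVGD from plain gradient ascent on log-density.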
Abstract: Large-scale multi-objective optimization problems (LSMOPs) pose challenges to existing optimizers because a set of well-converged and diverse solutions must be found in a huge search space.
Abstract: We present a robust FFT-based approach to scale-invariant image registration. Our method relies on FFT-based correlation twice: once in the log-polar Fourier domain to estimate the scaling ...
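The basic building block the abstract refers to, FFT-based correlation, can be shown in isolation with phase correlation for pure translation; the paper's full method additionally applies it in the log-polar Fourier domain to recover scale. A minimal sketch, assuming NumPy and a circular shift (the function name `phase_correlate` is illustrative):

```python
import numpy as np

def phase_correlate(a, b):
    # Normalized cross-power spectrum; its inverse FFT is (ideally) a
    # delta function located at the translation between a and b
    F = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
    r = np.fft.ifft2(F / (np.abs(F) + 1e-12)).real
    return np.unravel_index(np.argmax(r), r.shape)

rng = np.random.default_rng(1)
img = rng.random((64, 64))
shifted = np.roll(img, (5, 9), axis=(0, 1))
shift = phase_correlate(shifted, img)   # recovers the (5, 9) circular shift
```

Normalizing by the magnitude of the cross-power spectrum is what makes the peak sharp and the method robust to uniform illumination changes.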
Training very deep neural networks requires a lot of memory. Using the tools in this package, developed jointly by Tim Salimans and Yaroslav Bulatov, you can trade off some of this memory usage with ...
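The trade-off behind this kind of gradient checkpointing can be sketched independently of the package itself (this is a conceptual illustration of the memory/compute exchange, not this package's API): for a chain of n layers, store only one activation per segment of about sqrt(n) layers and recompute the rest inside each segment during the backward pass.

```python
import math

def checkpoint_cost(n):
    # Segment length ~ sqrt(n) gives the classic O(sqrt(n)) memory scheme
    seg = max(1, int(math.sqrt(n)))
    stored = math.ceil(n / seg)   # checkpointed activations kept in memory
    recomputed = n - stored       # activations recomputed on the backward pass
    return stored, recomputed

stored, recomputed = checkpoint_cost(100)
# a 100-layer chain keeps ~10 activations instead of 100, at the cost of
# roughly one extra forward pass worth of recomputation
```

This is why memory drops from O(n) to O(sqrt(n)) while total compute grows only by a constant factor.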
† Department of Chemistry, Chemical Theory Center, and the Minnesota Supercomputing Institute, The University of Minnesota, Minneapolis, Minnesota 55455, United States ‡ Department of ...