
Optimizing with respect to two variables (p and d)

jason-hn opened this issue 5 years ago · 1 comment

Hi, I'm just a little confused about why the two steps work at a calculus level:

First, is the expression we're minimizing convex? Otherwise, we might run into the issue that there is more than one possible location at the next time step.

Second, how do we make sure that each time we minimize over one of the two variables ($d_{ij}$ and $p_i$), the overall expression gets closer to a local minimum?

Thanks!

jason-hn · Dec 06 '19

The time steps are relatively small, so it's unlikely we'll land in a "bad" local minimum. The optimization algorithm we're using is known as block coordinate descent: we alternate between minimizing exactly over one block of variables (the spring directions $d_{ij}$) while the other block (the positions $p_i$) is held fixed, and vice versa. Because each sub-step solves its sub-problem exactly, the overall energy can never increase, so the iterates monotonically decrease the objective and converge for nice functions like the one we're working with.
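If it helps to see the alternation concretely, here is a minimal NumPy sketch of one time step under that scheme. This is not the assignment's C++ code; the function name, argument layout, and the uniform stiffness `k` and per-vertex mass `m` are assumptions made for brevity.

```python
import numpy as np

def mass_spring_step(p, p_prev, edges, rest_len, k, m, h, f_ext, iters=20):
    """One implicit time step via block coordinate descent (local-global).

    p, p_prev : (n, dim) current and previous vertex positions
    edges     : list of (i, j) index pairs, one per spring
    rest_len  : (num_edges,) spring rest lengths
    k, m, h   : spring stiffness, per-vertex mass, time-step size
    f_ext     : (n, dim) external forces (e.g. gravity)
    """
    n, dim = p.shape
    y = 2.0 * p - p_prev  # inertial prediction of where vertices "want" to go

    # Global-step matrix Q = (m/h^2) I + k L; it depends on neither p nor d,
    # so a real implementation would typically prefactor it once.
    L = np.zeros((n, n))
    for i, j in edges:
        L[i, i] += 1.0
        L[j, j] += 1.0
        L[i, j] -= 1.0
        L[j, i] -= 1.0
    Q = (m / h**2) * np.eye(n) + k * L

    p_new = p.copy()
    for _ in range(iters):
        # Local step: with p fixed, the optimal d_ij is the current spring
        # direction rescaled to the rest length (a closed-form projection).
        b = (m / h**2) * y + f_ext
        for (i, j), r in zip(edges, rest_len):
            diff = p_new[i] - p_new[j]
            d = r * diff / (np.linalg.norm(diff) + 1e-12)
            b[i] += k * d
            b[j] -= k * d
        # Global step: with d fixed, the energy is quadratic in p, so the
        # minimizer is a single linear solve.
        p_new = np.linalg.solve(Q, b)
    return p_new
```

Both sub-steps are exact minimizations of the same energy, so each pass through the loop can only keep the energy the same or lower it; that monotone decrease is the informal answer to your second question. Convexity in $p$ for fixed $d$ is also what makes the global step a single linear solve.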

abhimadan · Dec 07 '19