The overall purpose of this seminar is to bring together people interested in Computer Vision theory and techniques and to examine current research issues. The course is appropriate for students who have already taken a graduate Computer Vision course or who have prior research experience in Computer Vision. To enroll, you must either (1) be in the Ph.D. program or (2) receive permission from the instructors.

Each seminar will consist of several short talks (around 15 minutes each) by students. Students can register for CSE656 for 1 credit. Registered students must attend in person and present at least 2 talks; up to 3 absences will be excused. Everyone else is welcome to attend.

The seminar will be taught by Prof. Chao Chen, chao.chen.1@stonybrook.edu.

I will be holding an informal two-week short course on optimization, aiming
to cover a few important proofs in the field. The goal is depth over
breadth, with a focus on:

 - convergence proofs for gradient descent and stochastic gradient descent
 - energy functions and continuous time optimization
 - estimate sequences and Nesterov acceleration

and, time permitting, additional topics such as variance reduction,
quasi-Newton methods, and Frank-Wolfe methods (a sketch of the flavor of
these results follows below). If we move quickly, we can spend a few days
at the end brainstorming research project ideas.
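
As a taste of the kind of result we will prove, here is the classic
gradient descent bound in the standard smooth convex setting (a sketch
only; the exact statements and assumptions covered in the course may
differ):

```latex
% For convex, L-smooth f, gradient descent with step size 1/L,
\[
  x_{k+1} \;=\; x_k - \tfrac{1}{L}\,\nabla f(x_k),
\]
% satisfies, for every k >= 1,
\[
  f(x_k) - f(x^\ast) \;\le\; \frac{L\,\lVert x_0 - x^\ast \rVert^2}{2k},
\]
% i.e., an O(1/k) rate; Nesterov acceleration improves this to O(1/k^2).
```

And a minimal sketch of Nesterov's accelerated method, using the
FISTA-style momentum schedule (illustrative only, not necessarily the
exact scheme we will analyze via estimate sequences):

```python
import numpy as np

def nesterov_agd(grad, x0, L, steps):
    """Nesterov's accelerated gradient method for a convex, L-smooth f.

    Achieves f(x_k) - f* = O(1/k^2), versus O(1/k) for plain gradient
    descent. `grad` returns the gradient of f; `L` is a smoothness constant.
    """
    x = y = np.asarray(x0, dtype=float)
    t = 1.0
    for _ in range(steps):
        x_next = y - grad(y) / L                           # gradient step at the extrapolated point
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0  # momentum schedule
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)   # look-ahead extrapolation
        x, t = x_next, t_next
    return x
```

For example, on a quadratic f(x) = (1/2) x'Ax - b'x one can take
grad = lambda x: A @ x - b, with L the largest eigenvalue of A.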

Details: NCS 220, 6:15pm-7:45pm, Monday-Friday, Feb 7-Feb 18.

In person only, since I plan to use the whiteboard (though sessions may be recorded).

More details (notes, specific schedule) will be uploaded here:
https://sites.google.com/view/optimization-short-course/home
