Many problems in control and optimization require treating dynamic systems in which continuous dynamics and discrete events coexist and interact; such systems are the focus of this talk. We present a survey of some of our recent work on these systems. In our setup, the discrete event is given by a random process, and the continuous component is the solution of a stochastic differential equation. Although seemingly similar to diffusions, these processes have a number of salient features that are distinctly different from those of diffusion processes. After giving motivational examples arising from wireless communications, identification, finance, singularly perturbed Markovian systems, manufacturing, and consensus control, we present results on recurrence, necessary and sufficient conditions for the existence of a unique invariant measure, stability, stabilization, and numerical solutions of control and game problems.
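
To fix ideas, a minimal sketch of the kind of model described above, assuming the standard regime-switching diffusion formulation (the symbols $X$, $\alpha$, $b$, $\sigma$, and $W$ are introduced here only for illustration and are not taken from the abstract), is

\[
dX(t) = b\bigl(X(t), \alpha(t)\bigr)\,dt + \sigma\bigl(X(t), \alpha(t)\bigr)\,dW(t), \qquad X(0)=x_0,\ \alpha(0)=\alpha_0,
\]

where $W(\cdot)$ is a standard Brownian motion and $\alpha(\cdot)$ is a random switching process taking values in a finite set (for example, a continuous-time Markov chain, possibly with transition rates that depend on the continuous state $X(t)$). The coupling between the continuous component $X(\cdot)$ and the discrete component $\alpha(\cdot)$ is what distinguishes such hybrid processes from ordinary diffusions.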