When $a \ne 0$, there are two solutions to $ax^2 + bx + c = 0$ and they are $$x = {-b \pm \sqrt{b^2-4ac} \over 2a}.$$
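As a quick numeric check, here is a minimal Python sketch of the formula (the function name and test coefficients are mine, not part of the sample):

```python
import cmath

def quadratic_roots(a, b, c):
    """Both roots of a*x**2 + b*x + c = 0, assuming a != 0."""
    disc = cmath.sqrt(b**2 - 4*a*c)  # complex sqrt also handles disc < 0
    return (-b + disc) / (2*a), (-b - disc) / (2*a)

print(quadratic_roots(1, -3, 2))  # ((2+0j), (1+0j)): roots of x^2 - 3x + 2
```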
\begin{align}
\dot{x} & = \sigma(y-x) \\
\dot{y} & = \rho x - y - xz \\
\dot{z} & = -\beta z + xy
\end{align}
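These are the Lorenz equations. A minimal integration sketch, assuming the classic parameter values $\sigma = 10$, $\rho = 28$, $\beta = 8/3$ (the sample itself leaves them unspecified):

```python
from scipy.integrate import solve_ivp

SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0  # assumed classic values

def lorenz(t, state):
    x, y, z = state
    return [SIGMA * (y - x), RHO * x - y - x * z, -BETA * z + x * y]

# Integrate from an arbitrary starting point for 40 time units.
sol = solve_ivp(lorenz, (0, 40), [1.0, 1.0, 1.0])
print(sol.y[:, -1])  # final (x, y, z) on the attractor
```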
$$ \left( \sum_{k=1}^n a_k b_k \right)^{\!\!2} \leq \left( \sum_{k=1}^n a_k^2 \right) \left( \sum_{k=1}^n b_k^2 \right) $$
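This is the Cauchy–Schwarz inequality; a quick spot-check on random vectors (purely illustrative):

```python
import random

a = [random.uniform(-1, 1) for _ in range(10)]
b = [random.uniform(-1, 1) for _ in range(10)]

lhs = sum(ak * bk for ak, bk in zip(a, b)) ** 2
rhs = sum(ak**2 for ak in a) * sum(bk**2 for bk in b)
assert lhs <= rhs  # holds for every real a, b
print(lhs, "<=", rhs)
```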
$$ \mathbf{V}_1 \times \mathbf{V}_2 = \begin{vmatrix}
\mathbf{i} & \mathbf{j} & \mathbf{k} \\
\frac{\partial X}{\partial u} & \frac{\partial Y}{\partial u} & 0 \\
\frac{\partial X}{\partial v} & \frac{\partial Y}{\partial v} & 0 \\
\end{vmatrix} $$
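Since both tangent vectors have zero $\mathbf{k}$-component, the cross product reduces to a single $\mathbf{k}$-term; a small NumPy check with made-up partials:

```python
import numpy as np

v1 = np.array([2.0, 1.0, 0.0])  # (dX/du, dY/du, 0) -- illustrative values
v2 = np.array([0.5, 3.0, 0.0])  # (dX/dv, dY/dv, 0)
print(np.cross(v1, v2))         # [0. 0. 5.5]: only the k-component survives
```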
$$ P(E) = {n \choose k} p^k (1-p)^{n-k} $$
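This is the binomial probability mass function; a direct Python transcription (the function name is mine):

```python
from math import comb

def binomial_pmf(n, k, p):
    """Probability of exactly k successes in n independent Bernoulli(p) trials."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

print(binomial_pmf(10, 3, 0.5))  # 0.1171875
```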
$$ \frac{1}{(\sqrt{\phi \sqrt{5}}-\phi) e^{\frac25 \pi}} =
1+\frac{e^{-2\pi}} {1+\frac{e^{-4\pi}} {1+\frac{e^{-6\pi}}
{1+\frac{e^{-8\pi}} {1+\ldots} } } }
$$
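This is Ramanujan's continued-fraction identity ($\phi$ is the golden ratio). Evaluating the fraction bottom-up from a finite depth reproduces the closed form; depth 20 is an arbitrary truncation:

```python
from math import exp, pi, sqrt

phi = (1 + sqrt(5)) / 2
lhs = 1 / ((sqrt(phi * sqrt(5)) - phi) * exp(2 * pi / 5))

cf = 1.0                    # the trailing "1 + ..." tail
for j in range(20, 0, -1):  # work from the innermost level outward
    cf = 1 + exp(-2 * pi * j) / cf

print(lhs, cf)  # both sides agree to machine precision
```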
$$ 1 + \frac{q^2}{(1-q)}+\frac{q^6}{(1-q)(1-q^2)}+\cdots =
\prod_{j=0}^{\infty}\frac{1}{(1-q^{5j+2})(1-q^{5j+3})},
\quad\quad \text{for $|q|<1$}.
$$
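This is one of the Rogers–Ramanujan identities; truncating both the series and the product gives a quick numeric check (the cutoffs of 30 terms are arbitrary):

```python
q = 0.1  # any |q| < 1

lhs, denom = 1.0, 1.0
for n in range(1, 30):  # series terms q^(n^2+n) / ((1-q)...(1-q^n))
    denom *= 1 - q**n
    lhs += q**(n*n + n) / denom

rhs = 1.0
for j in range(30):     # truncated infinite product
    rhs /= (1 - q**(5*j + 2)) * (1 - q**(5*j + 3))

print(lhs, rhs)  # agree to machine precision for q = 0.1
```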
\begin{align}
\nabla \times \vec{\mathbf{B}} -\, \frac1c\, \frac{\partial\vec{\mathbf{E}}}{\partial t} & = \frac{4\pi}{c}\vec{\mathbf{j}} \\
\nabla \cdot \vec{\mathbf{E}} & = 4 \pi \rho \\
\nabla \times \vec{\mathbf{E}}\, +\, \frac1c\, \frac{\partial\vec{\mathbf{B}}}{\partial t} & = \vec{\mathbf{0}} \\
\nabla \cdot \vec{\mathbf{B}} & = 0
\end{align}
Finally, while display equations look good for a page of samples, the ability to mix math and text in a paragraph is also important. This expression $\sqrt{3x-1}+(1+x)^2$ is an example of an inline equation. As you see, MathJax equations can be used this way as well, without unduly disturbing the spacing between lines.
$E = mc^2$
$A_i = B_i + C_i \sum_{k=0}^{i} D_k E^k$
\begin{eqnarray}
A_i &=& B_i + C_i \sum_{k=0}^{i} D_k E^k \\
F_i &=& \int_{-\infty}^{x_i} f(x) dx
\end{eqnarray}
$\frac{w_x}{\sum_z x_z}$
$\frac{w}{\sum_{z} x_z}$
$x_\gamma = x_i$
$x_i = x_\gamma$
The cost function for regularized logistic regression (a refresher):
$$J(\theta) = - \frac{1}{m} \sum_{i=1}^m \left[ y^{(i)}\ \log (h_\theta (x^{(i)})) + (1 - y^{(i)})\ \log (1 - h_\theta(x^{(i)}))\right] + \frac{\lambda}{2m}\sum_{j=1}^n \theta_j^2$$
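A minimal NumPy sketch of this cost (the design-matrix layout with a leading column of ones, and the convention that $\theta_0$ is not regularized, are assumptions matching the $j = 1..n$ sum):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def logistic_cost(theta, X, y, lam):
    """J(theta): X is (m, n+1) with a leading ones column, y is (m,) in {0, 1}."""
    m = len(y)
    h = sigmoid(X @ theta)
    cross_entropy = -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m
    reg = lam / (2 * m) * np.sum(theta[1:] ** 2)  # skip theta_0, per the j = 1..n sum
    return cross_entropy + reg
```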
For Neural Networks, it is:
$$ J(\Theta) = - \frac{1}{m} \sum_{i=1}^m \sum_{k=1}^K \left[ y_k^{(i)} \log ((h_\Theta (x^{(i)}))_k) + (1 - y_k^{(i)}) \log (1 - (h_\Theta(x^{(i)}))_k) \right] + \frac{\lambda}{2m} \sum_{l=1}^{L-1} \sum_{p=1}^{s_l} \sum_{n=1}^{s_{l+1}} ( \Theta_{n,p}^{(l)} )^2 $$
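A matching sketch for the network cost, assuming `H` holds the already-forward-propagated outputs and each $\Theta^{(l)}$ stores its bias weights in column 0 (which the $p = 1..s_l$ sum leaves unregularized):

```python
import numpy as np

def nn_cost(Thetas, H, Y, lam):
    """J(Theta): H is (m, K) outputs h_Theta(x)_k, Y is (m, K) one-hot labels."""
    m = Y.shape[0]
    cross_entropy = -np.sum(Y * np.log(H) + (1 - Y) * np.log(1 - H)) / m
    # Regularize every weight except each matrix's bias column (index 0).
    reg = lam / (2 * m) * sum(np.sum(T[:, 1:] ** 2) for T in Thetas)
    return cross_entropy + reg
```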