Which one of the following matrices has an inverse?
The inverse of a square matrix exists if and only if the matrix is non-singular, that is, its determinant is non-zero. A square matrix is singular (determinant = 0) exactly when its rows or columns are linearly dependent.
\(R_1 = 2R_3\). The rows are linearly dependent, so \(\det=0\).
\(R_2 = 2R_1\). The rows are linearly dependent, so \(\det=0\).
\(R_3 = 3R_1\). The rows are linearly dependent, so \(\det=0\).
Let \(C = \begin{pmatrix}1&4&8\\ 0&4&2\\ 1&2&4\end{pmatrix}\). \[\begin{aligned} \det(C) &= 1(4 \cdot 4 - 2 \cdot 2) - 4(0 \cdot 4 - 2 \cdot 1) + 8(0 \cdot 2 - 4 \cdot 1) \\ &= 1(16-4) - 4(-2) + 8(-4) = 12 + 8 - 32 = -12 \end{aligned}\] Since \(\det(C) = -12 \ne 0\), the matrix \(C\) is non-singular and has an inverse.
Answer: (c).
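As a quick cross-check (not part of the original solution), the determinant and the existence of the inverse of \(C\) can be verified numerically; the following is a minimal sketch using NumPy.

```python
# Minimal numerical check of det(C) and of the existence of C's inverse (NumPy sketch).
import numpy as np

C = np.array([[1, 4, 8],
              [0, 4, 2],
              [1, 2, 4]])

print(np.linalg.det(C))        # approximately -12.0, so C is non-singular
print(np.linalg.inv(C) @ C)    # approximately the 3x3 identity matrix
```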
Let \(X\) be a discrete random variable that is uniformly distributed over the set \[S = \{-10,-9,\ldots,0,\ldots,9,10\}.\] Which of the following random variables is/are uniformly distributed?
A discrete random variable \(Y\) is uniformly distributed if all possible values in its range, \(R_Y\), occur with the same probability.
The set \(S\) has 21 elements (from \(-10\) to \(10\)). Since \(X\) is uniformly distributed, \(P(X=x) = 1/21\) for all \(x \in S\).
\(Y = X^{2}\): The range of \(Y\) is \(R_Y = \{0, 1, 4, \ldots, 100\}\). \(P(Y=0) = P(X=0) = 1/21\). \(P(Y=k^2) = P(X=k) + P(X=-k) = 2/21\) for \(k \ne 0\). Since \(P(Y=0) \ne P(Y=k^2)\), \(Y\) is not uniformly distributed.
\(Y = X^{3}\): The range of \(Y\) is \(R_Y = \{-10^3, -9^3, \ldots, 0, \ldots, 9^3, 10^3\}\). The function \(g(x) = x^3\) is one-to-one, so \(P(Y=y) = P(X=x) = 1/21\) for all \(y \in R_Y\). \(Y\) is uniformly distributed.
\(Y = (X-5)^{2}\): The values of \(X-5\) run over \(\{-15, -14, \ldots, 5\}\). Squaring maps some of these values in pairs and others singly: for example, \(P(Y=25) = P(X=0) + P(X=10) = 2/21\), while \(P(Y=225) = P(X=-10) = 1/21\). The probabilities are not all equal, so \(Y\) is not uniformly distributed.
\(Y = (X+10)^{2}\): The values of \(X+10\) range over \(\{0, 1, \ldots, 20\}\). Since \(x+10 \ge 0\) for every \(x \in S\), the mapping \(g(x) = (x+10)^2\) is one-to-one on \(S\), so \(P(Y=y) = 1/21\) for all \(y \in R_Y\). \(Y\) is uniformly distributed.
Answer: (b), (d).
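For readers who want to verify the four cases above, the sketch below (not part of the original solution) tabulates each transformed variable's distribution using only the standard library and checks whether every value occurs with the same probability.

```python
# Sketch: tabulate P(Y = y) for each transform of X and test for uniformity.
from collections import Counter
from fractions import Fraction

S = range(-10, 11)                     # the 21 equally likely values of X
p = Fraction(1, 21)                    # P(X = x) for each x in S

transforms = {
    "X^2":      lambda x: x ** 2,
    "X^3":      lambda x: x ** 3,
    "(X-5)^2":  lambda x: (x - 5) ** 2,
    "(X+10)^2": lambda x: (x + 10) ** 2,
}

for name, g in transforms.items():
    counts = Counter(g(x) for x in S)  # how many x-values map to each y
    probs = {y: n * p for y, n in counts.items()}
    print(name, "uniform:", len(set(probs.values())) == 1)
# Expected output: only X^3 and (X+10)^2 are uniform, matching (b) and (d).
```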
Which of the following complex functions is/are analytic on the complex plane?
A function \(f(z) = u(x, y) + jv(x, y)\) is analytic on the complex plane if the Cauchy-Riemann equations hold everywhere: \[\frac{\partial u}{\partial x} = \frac{\partial v}{\partial y} \quad \text{and} \quad \frac{\partial u}{\partial y} = -\frac{\partial v}{\partial x}\]
\(f(z) = j \text{ Re}(z)\): \(f(z) = jx\). Here, \(u(x, y) = 0\) and \(v(x, y) = x\). \(\frac{\partial u}{\partial x} = 0\), \(\frac{\partial v}{\partial y} = 0\). (Holds: \(0=0\)) \(\frac{\partial u}{\partial y} = 0\), \(\frac{\partial v}{\partial x} = 1\). (Fails: \(0 \ne -1\)) Thus, \(f(z)=j \text{ Re}(z)\) is not analytic.
\(f(z) = \text{Im}(z)\): \(f(z) = y\). Here, \(u(x, y) = y\) and \(v(x, y) = 0\). \(\frac{\partial u}{\partial x} = 0\), \(\frac{\partial v}{\partial y} = 0\). (Holds: \(0=0\)) \(\frac{\partial u}{\partial y} = 1\), \(\frac{\partial v}{\partial x} = 0\). (Fails: \(1 \ne -0\)) Thus, \(f(z)=\text{Im}(z)\) is not analytic.
\(f(z) = e^{|z|}\): Since \(|z| = \sqrt{x^2+y^2}\), the function is \(f(z) = e^{\sqrt{x^2+y^2}}\). \(u(x, y) = e^{\sqrt{x^2+y^2}}\), \(v(x, y) = 0\). \(\frac{\partial v}{\partial y} = 0\). \(\frac{\partial u}{\partial x} = \frac{x}{\sqrt{x^2+y^2}} e^{\sqrt{x^2+y^2}}\). For \(\frac{\partial u}{\partial x} = \frac{\partial v}{\partial y}\) to hold, we need \(\frac{\partial u}{\partial x} = 0\), which is only true when \(x=0\). Thus, \(f(z)=e^{|z|}\) is not analytic everywhere.
\(f(z) = z^{2} - z\): \(f(z)=(x+jy)^2-(x+jy) = (x^2-y^2-x) + j(2xy-y)\). \(u(x,y) = x^2-y^2-x\), \(v(x,y) = 2xy-y\). \(\frac{\partial u}{\partial x} = 2x - 1\), \(\frac{\partial v}{\partial y} = 2x - 1\). (Holds) \(\frac{\partial u}{\partial y} = -2y\), \(\frac{\partial v}{\partial x} = 2y\). (Holds: \(-2y = -2y\)) The C-R equations hold everywhere, confirming the function is analytic.
Answer: (d).
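The Cauchy-Riemann checks above can also be run symbolically; the following sketch (not part of the original solution) uses SymPy on the real and imaginary parts \(u\) and \(v\) derived in each case.

```python
# Sketch: verify the Cauchy-Riemann equations symbolically with SymPy.
import sympy as sp

x, y = sp.symbols('x y', real=True)

# (u, v) pairs taken from the case analysis above.
candidates = {
    "j Re(z)": (sp.Integer(0), x),
    "Im(z)":   (y, sp.Integer(0)),
    "e^|z|":   (sp.exp(sp.sqrt(x**2 + y**2)), sp.Integer(0)),
    "z^2 - z": (x**2 - y**2 - x, 2*x*y - y),
}

for name, (u, v) in candidates.items():
    cr1 = sp.simplify(sp.diff(u, x) - sp.diff(v, y))   # u_x - v_y, should be 0
    cr2 = sp.simplify(sp.diff(u, y) + sp.diff(v, x))   # u_y + v_x, should be 0
    print(name, "analytic everywhere:", cr1 == 0 and cr2 == 0)
# Expected output: only z^2 - z satisfies both equations identically.
```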
Consider the complex function \(f(z)=\cos z+e^{z^{2}}\). The coefficient of \(z^{5}\) in the Taylor series expansion of \(f(z)\) about the origin is ______ (rounded off to 1 decimal place).
The Taylor series expansion of \(f(z)\) about the origin is the sum of the Maclaurin series for \(\cos z\) and \(e^{z^2}\). The general Maclaurin series are: \[\begin{aligned} \cos z &= 1 - \frac{z^2}{2!} + \frac{z^4}{4!} - \frac{z^6}{6!} + \ldots \quad (\text{only even powers of } z) \\ e^w &= 1 + w + \frac{w^2}{2!} + \frac{w^3}{3!} + \ldots \end{aligned}\] The series for \(e^{z^2}\) is obtained by substituting \(w=z^2\) into the series for \(e^w\): \[e^{z^2} = 1 + z^2 + \frac{z^4}{2} + \frac{z^6}{6} + \ldots \quad (\text{only even powers of } z)\] Since both \(\cos z\) and \(e^{z^2}\) only contain even powers of \(z\), the coefficient of \(z^5\) in \(f(z)\) is \(0\). Answer: 0.0
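As a cross-check (not part of the original solution), the coefficient can be read off a truncated series computed with SymPy:

```python
# Sketch: extract the z^5 coefficient of cos(z) + exp(z^2) with SymPy.
import sympy as sp

z = sp.symbols('z')
f = sp.cos(z) + sp.exp(z**2)
poly = sp.series(f, z, 0, 8).removeO()   # Maclaurin expansion up to z^7
print(poly.coeff(z, 5))                  # expected output: 0
```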
The sum of the eigenvalues of the matrix \(A=\begin{pmatrix}1&2\\ 3&4\end{pmatrix}^{2}\) is ______ (rounded off to the nearest integer).
Let \(M = \begin{pmatrix}1&2\\ 3&4\end{pmatrix}\) and \(A = M^2\). 1. Calculate the matrix \(A = M^2\): \[A = \begin{pmatrix}1&2\\ 3&4\end{pmatrix} \begin{pmatrix}1&2\\ 3&4\end{pmatrix} = \begin{pmatrix} 1 \cdot 1 + 2 \cdot 3 & 1 \cdot 2 + 2 \cdot 4 \\ 3 \cdot 1 + 4 \cdot 3 & 3 \cdot 2 + 4 \cdot 4 \end{pmatrix} = \begin{pmatrix} 7 & 10 \\ 15 & 22 \end{pmatrix}\] 2. Find the trace of \(A\): The trace of \(A\), denoted \(\text{Tr}(A)\), is the sum of its diagonal elements: \[\text{Tr}(A) = 7 + 22 = 29\] Answer: 29
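A quick numerical confirmation (not part of the original solution) with NumPy:

```python
# Sketch: compute A = M^2 and the sum of its eigenvalues with NumPy.
import numpy as np

M = np.array([[1, 2],
              [3, 4]])
A = M @ M
print(A)                           # [[ 7 10]
                                   #  [15 22]]
print(np.linalg.eigvals(A).sum())  # approximately 29.0, equal to trace(A)
```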
Let \(f(t)\) be a real-valued function whose second derivative is positive for \(-\infty<t<\infty\). Which of the following statements is/are always true?
The condition \(f^{\prime\prime}(t)>0\) for all \(t \in (-\infty, \infty)\) implies that the function \(f(t)\) is strictly convex everywhere.
\(f(t)\) has at least one local minimum. This is not always true. For example, \(f(t) = e^t\) has \(f''(t) = e^t > 0\), but no critical points.
\(f(t)\) cannot have two distinct local minima. This is always true. Since \(f''(t) > 0\), the derivative \(f'(t)\) is strictly increasing and can vanish at most once, so \(f(t)\) has at most one critical point and therefore at most one local minimum.
\(f(t)\) has at least one local maximum. This is never true. Since \(f^{\prime\prime}(t)>0\), any critical point must be a local minimum.
The minimum value of \(f(t)\) cannot be negative. This is not always true. For example, \(f(t) = t^2 - 10\) has a minimum value of \(-10\) at \(t=0\).
Answer: (b).
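The two counterexamples used above can be confirmed symbolically; the sketch below (not part of the original solution) uses SymPy.

```python
# Sketch: confirm the counterexamples used in the solution with SymPy.
import sympy as sp

t = sp.symbols('t', real=True)

# f(t) = exp(t): f'' = exp(t) > 0 everywhere, yet f' never vanishes,
# so f has no critical point and hence no local minimum.
f1 = sp.exp(t)
print(sp.diff(f1, t, 2))                  # exp(t)
print(sp.solve(sp.diff(f1, t), t))        # []  (no critical point)

# f(t) = t**2 - 10: f'' = 2 > 0, and the minimum value is -10 < 0.
f2 = t**2 - 10
t_min = sp.solve(sp.diff(f2, t), t)[0]    # t = 0
print(f2.subs(t, t_min))                  # -10
```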
Consider the function \(f(t)=(\max(0,t))^{2}\) for \(-\infty<t<\infty\), where \(\max(a,b)\) denotes the maximum of \(a\) and \(b\). Which of the following statements is/are true?
(a) \(f(t)\) is not differentiable. (b) \(f(t)\) is differentiable and its derivative is continuous. (c) \(f(t)\) is differentiable but its derivative is not continuous. (d) \(f(t)\) and its derivative are differentiable.
The function \(f(t)\) is \[f(t)=(\max(0,t))^{2} = \begin{cases} t^{2}, & \text{for } t \ge 0 \\ 0, & \text{for } t < 0 \end{cases}\] 1. Find the derivative \(f'(t)\): for \(t > 0\), \(f'(t) = 2t\); for \(t < 0\), \(f'(t) = 0\). 2. Check differentiability at \(t=0\): the left-hand and right-hand derivatives at \(t=0\) are both \(0\), so \(f(t)\) is differentiable at \(t=0\). 3. Check continuity of \(f'(t)\) at \(t=0\): the derivative is \[f'(t) = \begin{cases} 2t, & \text{for } t > 0 \\ 0, & \text{for } t \le 0 \end{cases}\] which is continuous at \(t=0\) since \(\lim_{t\to 0^-} f'(t) = \lim_{t\to 0^+} f'(t) = f'(0) = 0\). Note that \(f'(t)\) itself is not differentiable at \(t=0\): its one-sided derivatives there are \(0\) (from the left) and \(2\) (from the right). Hence only (b) is correct. Answer: (b).
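The limits above can be checked symbolically; the following sketch (not part of the original solution) models \(f\) with a SymPy Piecewise and inspects \(f'\) near \(t=0\).

```python
# Sketch: check continuity (and non-differentiability) of f'(t) at t = 0 with SymPy.
import sympy as sp

t = sp.symbols('t', real=True)
f = sp.Piecewise((t**2, t >= 0), (0, True))    # f(t) = (max(0, t))^2

fp = sp.diff(f, t)                             # 2t for t > 0, 0 for t < 0
print(sp.limit(fp, t, 0, dir='-'))             # 0
print(sp.limit(fp, t, 0, dir='+'))             # 0   -> f' is continuous at t = 0

fpp = sp.diff(f, t, 2)
print(sp.limit(fpp, t, 0, dir='-'))            # 0
print(sp.limit(fpp, t, 0, dir='+'))            # 2   -> f' is not differentiable at t = 0
```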