Unfortunately, only one discriminative method commonly used in classification, linear regression, has a closed-form solution (linear discriminant analysis and the Fisher discriminant are generative, and even they admit closed-form solutions only because of the simplicity of the distributions they fit).

Even for linear regression, it is something of a marvel that a closed form exists at all. As far as I'm aware, nobody has proved a statement like "logistic regression cannot be solved in closed form," and such a proof seems nearly impossible; still, the widespread consensus is that no closed form will ever be found. A closed-form solution was demonstrated a few years ago for the special case where the features are binary only and there are few of them (the solution is exponential in the number of features), but it is believed to be impossible in general.
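To see what "no closed form" means in practice, here is a minimal sketch (on made-up data, with NumPy) of how logistic regression is actually fit: the maximum-likelihood equations involve the sigmoid nonlinearly in the coefficients, so instead of solving them directly you iterate, e.g. with Newton-Raphson:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Synthetic data (illustrative only): 200 samples, 2 features.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = (sigmoid(X @ np.array([1.5, -2.0])) > rng.uniform(size=200)).astype(float)

# The score equations X^T (y - sigmoid(X b)) = 0 are nonlinear in b,
# so we solve them iteratively via Newton's method (IRLS).
b = np.zeros(2)
for _ in range(20):
    p = sigmoid(X @ b)
    grad = X.T @ (y - p)            # gradient of the log-likelihood
    W = p * (1 - p)                 # per-sample Hessian weights
    H = X.T @ (X * W[:, None])      # Hessian (up to sign)
    b = b + np.linalg.solve(H, grad)

print(b)  # iterative estimate of the coefficients
```

Each Newton step itself solves a *linear* system, but the overall problem never reduces to one; the iteration is the whole method.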

So, what makes linear regression special? Once you compute the derivatives of its cost and set them to zero, you find that the result is a system of linear equations: m equations in m unknowns, which we know how to solve directly using matrix inversion (and other techniques). When the cost of logistic regression is differentiated, the resulting system is no longer linear. It is convex (so there is a global optimum), but because it is not linear, current mathematics does not give us tools powerful enough to express that optimum as a closed-form solution.
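The linear-equations argument above can be sketched concretely (with NumPy, on made-up data): zeroing the gradient of the squared-error cost gives the normal equations (X^T X) b = X^T y, which one matrix solve answers exactly, no iteration needed:

```python
import numpy as np

# Synthetic data (illustrative only): 100 samples, m = 3 features.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_b = np.array([2.0, -1.0, 0.5])
y = X @ true_b + 0.01 * rng.normal(size=100)

# Differentiating ||y - Xb||^2 and setting the gradient to zero yields
# the normal equations: m linear equations in m unknowns.
b = np.linalg.solve(X.T @ X, X.T @ y)

print(b)  # recovers coefficients close to [2, -1, 0.5]
```

The contrast with logistic regression is exactly here: differentiate the logistic cost and the sigmoid appears inside the equations, so no single solve like this exists.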