One thing you should have learned in class at some point is that larger problems do not always take the same amount of time to solve as small problems.
For example, how much time does it take to multiply two vectors, in an element-wise fashion?
N = 10000; x = rand(1,N); y = rand(1,N);
z = x.*y;
We would expect this to be roughly linear in the length of those vectors. Thus, double N, and it will take twice as much time to process. In the most basic sense, your computer does one thing at a time, so double the amount of work, and double the amount of time.
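If you want to see the linear scaling for yourself, here is a quick sketch (in Python with NumPy, purely for illustration — the same experiment works in MATLAB with tic/toc) that counts the scalar multiplications a naive element-wise product performs. One multiply per element, so doubling N exactly doubles the work:

```python
import numpy as np

def elementwise_product_ops(x, y):
    """Naive element-wise multiply that also counts scalar multiplications."""
    assert len(x) == len(y)
    z = np.empty_like(x)
    ops = 0
    for i in range(len(x)):
        z[i] = x[i] * y[i]
        ops += 1
    return z, ops

rng = np.random.default_rng(0)
n = 10000
x, y = rng.random(n), rng.random(n)
_, ops_n = elementwise_product_ops(x, y)
x2, y2 = rng.random(2 * n), rng.random(2 * n)
_, ops_2n = elementwise_product_ops(x2, y2)
print(ops_2n / ops_n)  # doubling N doubles the work -> 2.0
```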
However, many operations are NOT linear in the size of the problem. For example, the matrix multiply x*y below will require time roughly proportional to the cube of N to complete.
N = 1000; x = rand(N,N); y = rand(N,N);
z = x*y;
So, double N, and it will take roughly 8 times as much work to do. You can show this to be effectively true by counting all the computations in a matrix*matrix multiply. (The use of multi-threaded processing, the BLAS for linear algebra, etc., changes some of that, but even so, think O(N^3) here for the number of computations in a matrix multiply.)
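Counting those computations is easy to do mechanically. Here is a throwaway sketch (Python, for illustration only) of the textbook triple-loop multiply, tallying scalar multiplications: an n-by-n multiply does exactly n^3 of them, so doubling n multiplies the count by 8:

```python
import numpy as np

def naive_matmul(a, b):
    """Textbook triple-loop matrix multiply; also counts scalar multiplications."""
    n = a.shape[0]
    c = np.zeros((n, n))
    mults = 0
    for i in range(n):
        for j in range(n):
            for k in range(n):
                c[i, j] += a[i, k] * b[k, j]
                mults += 1
    return c, mults

rng = np.random.default_rng(1)
n = 20
a, b = rng.random((n, n)), rng.random((n, n))
c, m1 = naive_matmul(a, b)
_, m2 = naive_matmul(rng.random((2 * n, 2 * n)), rng.random((2 * n, 2 * n)))
print(m1, m2, m2 / m1)  # 8000 multiplies, 64000 multiplies, ratio 8.0
```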
Now you want to use mincx. It does an optimization. Bigger optimization problems behave just like the examples above: they take more time, and the increase in time is NOT linear. In fact, since mincx uses linear algebra internally, I would expect the work needed to grow with roughly the cube of the number of variables, because as the problem size gets larger, the computations will probably be dominated by operations that take O(N^3) time.
So if a 7 variable problem takes roughly 0.5 seconds, then for a problem with 5 times as many variables, at least SOME of the internal computations inside mincx will take 5^3 = 125 times as much computational effort. So without knowing any details of the computations inside mincx, I might predict a rough estimate of the time required for a 35 variable problem as:
.5*5^3
ans =
62.5
So something on the order of a minute to complete. That mincx took only 30 seconds or so is because not all of the internal computations scale as O(N^3).
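That back-of-the-envelope extrapolation can be written as a tiny helper (a Python sketch; the function name and signature are my own invention, not anything from mincx):

```python
def estimate_time(t_base, n_base, n_new, exponent=3):
    """Extrapolate runtime assuming cost grows like n**exponent (a rough model only)."""
    return t_base * (n_new / n_base) ** exponent

# 7 variables took ~0.5 s; predict a 35-variable problem under O(N^3) scaling:
print(estimate_time(0.5, 7, 35))  # -> 62.5 seconds
```

Of course this is only an upper-bound style of guess; as noted, the actual timing came in lower because not everything inside the solver is cubic.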
Is there a faster alternative to mincx? Um, well, gosh. Maybe the author of mincx intentionally chose a slow way to code the algorithm. Yeah, right.
So while it is possible to solve your problem faster, that might require a faster computer. Or solve smaller problems. Or learn to use the Parallel Computing Toolbox well, and get a good enough computer to handle it in stride. (You might need to buy that toolbox too, since it is not free.) Start your computer running overnight, and get a good night's rest while it works.
And, hey, it could be worse. Some problems grow exponentially with problem size in terms of the time required to solve them. That is true of many combinatorial problems. So O(N^3) is small potatoes.
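To see just how small those potatoes are, compare cubic growth against exponential growth at a few sizes (a trivial Python check):

```python
# Cubic vs. exponential growth: n^3 vs 2^n at a few problem sizes.
for n in (10, 20, 40, 80):
    print(n, n ** 3, 2 ** n)
# At n = 10 they are comparable; by n = 80, 2^n exceeds n^3
# by a factor of more than 10^18.
```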
Just wanting a big problem to run more quickly than it does is not sufficient. Big problems often take big time to solve. That your project might hinge on solving this is just a fact of life. People very often create far too large a problem to solve. It happens all the time. Sometimes it just means you need to retrench, back up, and solve a smaller problem that is mathematically tractable.
As far as your current problem goes, it does not seem at all impossible to solve. 9000 sub-problems, each of which takes 30 seconds to solve. That is 270,000 seconds, roughly three days of compute. Sorry, but I don't see the problem here. Start it running, and go read a book. I want to suggest one about numerical analysis, but "War and Peace" might be sufficient. :-) I've worked on problems that took weeks of computational time to solve.
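For the record, the total-time arithmetic (a trivial Python check) — 9000 sub-problems at 30 seconds each works out to 75 hours:

```python
subproblems = 9000
seconds_each = 30
total_seconds = subproblems * seconds_each
print(total_seconds, total_seconds / 3600)  # 270000 seconds, 75.0 hours
```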