I'm trying to simulate a phase-frequency detector built from NMOS and PMOS transistors, but I've hit a brick wall: the simulator aborts transient analysis with minimum-time-step errors, which I suspect is a convergence problem. It happens even on the simplest one-transistor NMOS inverter, which makes MOSFET circuit modeling unusable for me. An example schematic is attached. I tried changing some of the MOSFET model parameters (using the typical values from the Semiconductor Device Modeling with SPICE textbook), but I could not get any transient simulation to run. Any workarounds or suggestions?
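For concreteness, here is a minimal netlist-level sketch of the kind of one-transistor test case that fails for me, along with the convergence-related .options settings I've seen suggested for this class of error. This is not the attached schematic; the Level-1 model parameters are placeholders rather than the textbook values, and the option syntax is generic SPICE3/ngspice-style, so names may differ in your simulator:

* NMOS inverter transient test (minimal sketch, placeholder parameters)
.model MN NMOS (LEVEL=1 VTO=0.7 KP=110U LAMBDA=0.04)
VDD vdd 0 DC 5
* Nonzero rise/fall times on the source help the integrator converge
VIN in 0 PULSE(0 5 0 1N 1N 10N 20N)
* Resistive load on the drain
RD vdd out 10K
M1 out in 0 0 MN W=10U L=1U
* Loosened tolerances and Gear integration, often suggested for
* "time step too small" aborts (option names vary by simulator)
.options reltol=0.003 abstol=1e-9 vntol=1e-6 gmin=1e-10
.options method=gear
.tran 0.1N 100N
.end

Besides the tolerance options, the usual first-aid I've seen recommended is to avoid ideal zero-rise-time sources and to add a small capacitance (a few fF to pF) on any high-impedance or floating node so the node voltage can't jump discontinuously.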