We have studied only the supermassive black hole binary datasets in challenge 3.2. Our general approach has been to conduct the data analysis within the framework of Xspec (http://heasarc.gsfc.nasa.gov/docs/xanadu/xspec/index.html), a publicly available package widely used for spectral analysis of high-energy astronomical data. The gravitational-wave component is at an early experimental stage and is not yet part of the standard publicly available software. For these gravitational-wave searches we applied Xspec's standard fitting tools, primarily the tempered Metropolis-Hastings Markov-chain Monte Carlo (MCMC) algorithm. For this round we had modest goals, which we have only partially fulfilled.

We made extensive use of the MLDC dataset-generation pipeline codes, including the FastSpinBBH code and Synthetic LISA, in the hope of avoiding difficulties with conventions for source parameters and LISA specifics. The latter code is somewhat ill-suited to likelihood estimation, being intended as a general-purpose modeling tool for LISA rather than a rapid gravitational-wave response calculator. Though we intended this as a "zeroth" step to provide a baseline for verifying the consistency of a more efficient code with the MLDC pipelines, we have not yet succeeded in producing such an efficient, compatible code. We therefore used synthLISA directly for our black hole searches. Even so, we did not leave time to test the code thoroughly on the training data, an oversight which is likely to be evident in the challenge evaluation. Our slow code limited us to analyzing rather short samples of the data, each containing 1400-1500 frequency bins. For BH1 we studied a 500000 s segment of data, restricting initially to frequencies below 1.5 mHz, and then more narrowly to the frequencies at which the signal was significant. A number of chains were run and analyzed by hand in an effort to improve the fit.
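The tempered Metropolis-Hastings step can be sketched as follows. This is a generic illustration with a hypothetical one-parameter Gaussian toy likelihood, not the Xspec implementation; the step size, temperature, and likelihood are all placeholders.

```python
import math
import random

def log_likelihood(theta):
    """Toy log-likelihood: a standard Gaussian in one parameter (hypothetical)."""
    return -0.5 * theta ** 2

def tempered_mh(theta0, n_steps, temperature=2.0, step_size=0.5, seed=0):
    """Tempered Metropolis-Hastings: accept with probability min(1, exp(dlogL / T)).

    A temperature T > 1 flattens the likelihood surface so the chain can
    escape secondary maxima; T = 1 recovers ordinary Metropolis-Hastings.
    """
    rng = random.Random(seed)
    theta, logl = theta0, log_likelihood(theta0)
    chain = []
    for _ in range(n_steps):
        prop = theta + rng.gauss(0.0, step_size)   # symmetric random-walk proposal
        logl_prop = log_likelihood(prop)
        if math.log(rng.random()) < (logl_prop - logl) / temperature:
            theta, logl = prop, logl_prop          # accept the proposed point
        chain.append(theta)
    return chain

# Start the chain far from the likelihood peak at theta = 0.
chain = tempered_mh(theta0=5.0, n_steps=2000)
```

After burn-in the chain samples a broadened (temperature-flattened) version of the toy likelihood, concentrating around its maximum at zero.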
Our pedestrian approach, and the very limited time we allowed, left a significant residual, with an estimated reduced $\chi^2$ of nearly 5 even over our limited data sample. We applied a similar approach to BH2, using 1.2 million seconds of data below 1.25 mHz. In this case our fit is considerably worse, with reduced $\chi^2$ near 20. It is likely that the low-frequency portions of this signal are non-negligibly affected by BH3, which merges only slightly later with comparable signal frequencies. Though Xspec allows simultaneous searches for multiple sources, we made no attempt at a combined fit this time. Instead, we eliminated a larger portion of the low-frequency part of the signal for most of the parameter search. The last merger we found had a somewhat weaker signal, making it easier to fit. This time we estimated a reduced $\chi^2$ of about 1.1 for a sample taken from 1.5 million seconds of data with frequencies limited to below 0.8 mHz. In total, our search used a number of short chains (<1000 steps) and a few longer chains (up to 20000 steps), with an estimated total of less than 200000 likelihood evaluations conducted over 1.5 days on a MacBook Pro. We plan to continue development of our search tools in the hope of a more credible effort next time.
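The reduced $\chi^2$ figures quoted above are the usual noise-weighted squared residual per degree of freedom. A minimal sketch, with hypothetical per-bin data, model, and noise variances standing in for the actual search quantities:

```python
def reduced_chi2(data, model, variance, n_params):
    """Reduced chi^2: the sum of squared, noise-weighted residuals over
    N bins, divided by the degrees of freedom N - n_params.

    A value near 1 indicates residuals consistent with the noise level;
    values well above 1 indicate unmodeled signal or a poor fit.
    """
    residuals = [(d - m) ** 2 / v for d, m, v in zip(data, model, variance)]
    dof = len(data) - n_params
    return sum(residuals) / dof

# Hypothetical three-bin example with one fitted parameter:
chi2_nu = reduced_chi2([1.1, 2.0, 2.9], [1.0, 2.0, 3.0],
                       [0.01, 0.01, 0.01], n_params=1)  # ~1.0
```

By this measure, a reduced $\chi^2$ near 20 (as for BH2) means the residual power is far above the noise level, consistent with contamination from an unmodeled source such as BH3.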