Originally Posted by
crimsondevil
I played with it a bit a few weeks ago - it was pretty decent at coherently explaining random biological processes, although there was often a small piece that was incorrect. With some back and forth, it would eventually realize its mistake.
I then had a hilarious discussion with it about how realistic the asteroid in the movie Armageddon was. It gave the (apparently) in-movie specs for the asteroid and then got very confused about the amount of damage it would have done. It didn't seem to understand the importance of mass as opposed to volume, and it was totally incapable of determining the density of the asteroid and comparing it to a real-life asteroid until I led it through it. (FYI: at 500 billion tons or so, the asteroid in Armageddon is way under-massed for its stated size ("Texas"), giving a ludicrously low density.) It eventually "got" it, but it took a long time.
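For anyone curious, the back-of-the-envelope check described above is easy to sketch in Python. The specific figures here are my own assumptions for illustration, not from the movie: "500 billion tons" read as metric tons, "the size of Texas" read as a sphere roughly 1,250 km across, and ~2,000 kg/m³ as a typical stony-asteroid density.

```python
import math

# Assumed inputs (illustrative, not canonical movie specs):
mass_kg = 500e9 * 1000        # "500 billion tons", taken as metric tons
diameter_m = 1_250e3          # Texas is ~1,250 km across; treat as sphere diameter
radius_m = diameter_m / 2

# Volume of a sphere, then implied density = mass / volume
volume_m3 = (4 / 3) * math.pi * radius_m ** 3
density = mass_kg / volume_m3             # kg/m^3

rock_density = 2000                       # typical stony asteroid, kg/m^3
air_density = 1.2                         # sea-level air, kg/m^3

print(f"implied density: {density:.2e} kg/m^3")
print(f"lighter than rock by a factor of {rock_density / density:.1e}")
print(f"lighter than air? {density < air_density}")
```

Under these assumptions the implied density comes out around 5 × 10⁻⁴ kg/m³, millions of times less dense than rock and thousands of times less dense than air, which is the "ludicrously low density" point above.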
As a college prof, I think it will be an increasing problem, although mainly for papers and other such writing assignments. I fed it a couple of "short answer" questions from a recent exam of mine, and it was hit or miss. It got one question correct, fairly impressively, then totally screwed up another that was related to the first.
I also asked it to write me a paragraph explaining the significance of one of my lab's research topics, and it was... okay, but nothing great. It sounded like someone trying hard but without any real in-depth knowledge of the subject. It's impressive that it spat it out in under 30 seconds, though.