trnka t1_iz9ol1s wrote
Reply to comment by still_tyler in [D] Simple Questions Thread by AutoModerator
> one record close to another in x, y, z will likely have a similar outcome
That sounds a lot like k-nearest neighbors, or an SVM with an RBF kernel. Might be worth giving those a shot. That said, xgboost is effective on a wide range of problems, so I wouldn't be surprised if it's tough to beat. Under the hood I'm sure it's learning approximate, axis-aligned bounding boxes for your classes.
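A minimal sketch of what trying those two baselines might look like with scikit-learn (not from the thread; the synthetic data below stands in for the real (x, y, z) columns, and hyperparameters like `n_neighbors` are arbitrary):

```python
# Minimal sketch: compare k-NN and an RBF-kernel SVM on 3D coordinate
# features. The synthetic data is a placeholder for real (x, y, z) columns;
# feature scaling matters for both distance-based models.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=1000, n_features=3, n_informative=3,
                           n_redundant=0, random_state=0)

models = {
    "knn": make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=15)),
    "svm_rbf": make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma="scale")),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.3f} (+/- {scores.std():.3f})")
```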
I haven't heard of CNNs being used for this kind of problem. I've mostly seen CNNs used for spatial processing when the data is represented differently, for example when each input is a 3D shape represented by a 3D tensor (such as a voxel grid) rather than raw coordinates.
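To illustrate that representation point (my own sketch, not something from the thread), here is one way raw point coordinates could be voxelized into a 3D occupancy grid that a 3D CNN could consume; the grid resolution and bounds are arbitrary assumptions:

```python
# Rough illustration: turn a set of (x, y, z) points into a dense 3D
# occupancy grid (voxels), the kind of tensor a 3D CNN expects.
# Grid resolution and bounds are placeholder choices.
import numpy as np

def voxelize(points: np.ndarray, grid_size: int = 32,
             bounds: tuple = (-1.0, 1.0)) -> np.ndarray:
    """Map an Nx3 array of coordinates into a grid_size^3 occupancy volume."""
    lo, hi = bounds
    # Scale coordinates into [0, grid_size) and clip to stay inside the grid.
    idx = ((points - lo) / (hi - lo) * grid_size).astype(int)
    idx = np.clip(idx, 0, grid_size - 1)
    volume = np.zeros((grid_size, grid_size, grid_size), dtype=np.float32)
    volume[idx[:, 0], idx[:, 1], idx[:, 2]] = 1.0
    return volume

# Example: 100 random points in [-1, 1]^3 become a 32x32x32 tensor.
points = np.random.uniform(-1.0, 1.0, size=(100, 3))
grid = voxelize(points)
print(grid.shape, grid.sum())
```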
still_tyler t1_iz9prl7 wrote
Yeah, XGB still outperforms k-NN and SVM here. There are a bunch of other non-coordinate covariates that contribute, and XGB just kicks butt in this case. Fair enough, thanks for the response!