Abstract
Driver inattention is one of the leading causes of vehicle crashes and incidents worldwide. Driver inattention includes driver fatigue leading to drowsiness and driver distraction, for example due to cellphone use or rubbernecking, all of which lead to a lack of situational awareness. Hitherto, techniques presented to monitor driver attention have evaluated factors such as fatigue and distraction independently. However, in order to develop a robust driver attention monitoring system, all the factors affecting a driver's attention need to be analyzed holistically. In this paper, we present AutoRate, a system that leverages the front camera of a windshield-mounted smartphone to monitor driver attention by combining several features. We derive a driver attention rating by fusing spatio-temporal features based on driver state and behavior, such as head pose, eye gaze, eye closure, yawns, and use of cellphones. We perform extensive evaluation of AutoRate on real-world driving data as well as data from controlled, static-vehicle settings with 30 drivers in a large city. We compare AutoRate's automatically generated rating with the scores given by 5 human annotators. Further, we compute the agreement between AutoRate's rating and the human annotator ratings using the kappa coefficient. AutoRate's automatically generated rating has an overall agreement of 0.87 with the ratings provided by the 5 human annotators on the static dataset.
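The abstract reports agreement between AutoRate's rating and human annotator ratings via the kappa coefficient. As an illustration only (the paper's exact kappa variant and rating scale are not specified here), the following sketch computes Cohen's kappa, the standard chance-corrected agreement statistic for two raters, over hypothetical per-segment ratings:

```python
from collections import Counter

def cohens_kappa(a, b):
    """Cohen's kappa: chance-corrected agreement between two raters.

    a, b: equal-length sequences of categorical ratings
    (e.g., per-segment attention scores).
    """
    assert len(a) == len(b) and len(a) > 0
    n = len(a)
    # Observed agreement: fraction of items both raters scored identically.
    p_o = sum(x == y for x, y in zip(a, b)) / n
    # Expected chance agreement from each rater's marginal distribution.
    ca, cb = Counter(a), Counter(b)
    p_e = sum(ca[k] * cb[k] for k in ca) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical ratings: AutoRate vs. one human annotator on 10 segments.
auto  = [1, 2, 3, 3, 2, 1, 1, 2, 3, 2]
human = [1, 2, 3, 3, 2, 2, 1, 2, 3, 2]
print(round(cohens_kappa(auto, human), 3))
```

For more than two raters (e.g., the 5 human annotators), Fleiss' kappa is the usual multi-rater generalization; a kappa above 0.8 is conventionally read as almost perfect agreement.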