Please boost for reach among the blind community. Okay y'all, is it just me, or are the Meta Ray-Ban glasses' descriptions, even with detailed responses turned on in accessibility settings, still not very accurate? It feels like they're using Llama 3.1 8B, a small model. Am I going more crazy than I already am? Am I missing some context engineering tricks? Like, I don't get it. It said my coffee maker's filter basket was empty when it wasn't, said a cup of coffee was empty when it was about half full, then said the coffee cup was "folded" when I asked if it was full again, 'cause speech recognition still sucks I guess and the AI can't work around that, and said a washing machine was beside the bathroom counter when it was actually behind me, across from the counter. Like, this isn't me playing a video game, this is normal household stuff.