Sweat the Details Podcast – Privacy, Security, Facial Recognition

For some reason, I rarely post about Nest Realty’s podcast, Sweat the Details, but I think I’m going to start posting more. The episode we recorded last week is timely, relevant, sort of about real estate, and definitely about society’s need to be curious about all things.

We welcomed Dr. Maritza Johnson, who aims to make it easier for people to handle the security and privacy decisions they face in their daily lives. She completed her PhD in computer science at Columbia University and wrote her dissertation on the usability of Facebook privacy settings.

You can listen to this podcast here, and subscribe to the podcast here.

We talked about privacy, security, the difference between the two, smart homes and smart TVs, and the need for us all to be aware and curious about the technology that has fast become part of most of our daily lives.

Transcript

Jim:
This week on Sweat the Details, we welcomed Dr. Maritza Johnson, who aims to make it easier for people to handle the security and privacy decisions they face in their daily lives. She completed her PhD in computer science at Columbia University, and wrote her dissertation on the usability of Facebook privacy settings. We talked about security, privacy, the difference between the two, smart homes, smart TVs, and the need for all of us to be aware and curious about the technology that has fast become part of most of our daily lives. Hope you enjoy the conversation. We certainly did.

Jim:
Hey everybody, it’s Jim Duncan with Nest Realty and Sweat the Details, here with Keith Davis, my partner at Nest, and Dr. Maritza Johnson. Maritza, thanks for joining us today. If you don’t mind, give us a quick background on who you are, what you do, and sort of where we’re going to go with this conversation.

Maritza:
Yeah, thanks for having me. I’m super excited to be here. So my name is Maritza. I’ve worked in what I call human-centered security and privacy since about 2006. I’d say the thing that I’m very interested in is how do people, really just regular people, make sense of all of the security and privacy decisions that they’re faced with on a daily basis. So back in 2006, this was, how do you manage your username and password for your bank login? How do you deal with all your financial institutions going online? What does it mean to get a phishing email? And then a few years later it was, how do you manage your Facebook privacy settings so that the people who you want to see your content see it, but not the wrong people? A bit after that, it was how do you manage your online identities? You have a million usernames and passwords; how do you keep track of all that and make sense of it? Who knows what about you?

Maritza:
I guess the fun thing about my field is technology is always changing. We’re always seeing new technologies, and there’s always space for questioning: what data does it involve? How do we secure it? How does it fit into my life? And what do I need to do to protect myself? So I’m always looking at the new tech and asking kind of the same questions over time. I’m super excited to have a chance to chat with a different audience from what I usually talk with, since a lot of the time it’s academics or privacy advocates.

Jim:
I think that a lot of our audience is real estate based. I’d say, sort of a guess, that 65 to 70% of our audience are in the real estate space. And Keith will love my math, 25 to 40% of our audience are people outside of the field: consumers, clients, people who are curious about real estate. So as we were talking today about how we can frame security and privacy from an actionable perspective, you mentioned that it’s not helpful to capitalize on fear, uncertainty and doubt. There are a lot of fearful things in the online space on a daily basis, but how does someone think about privacy with all these scary headlines that we read on a daily, weekly basis?

Maritza:
Yeah, that’s a great question. For me, the thing that you need to ask yourself, well, I guess there are a couple of truths to get comfortable with, the biggest being you’ll never have perfect security. You’ll never have perfect privacy. I think on privacy, the conversation is always interesting because privacy is more than just secrecy. Increasingly as our life goes online and gets digitized and our actions generate data, it’s not just about who can see the information, who has the information, it’s also how is that information being used.

Maritza:
And this idea of data use, I think, is one of the most interesting questions we’re seeing right now. It comes up a lot lately around facial recognition. When you’re out in public, you can’t hide your face. I mean, it’s different today with COVID; it changes all the time. Now the expectation is that you’re going to be covering your face, and adjusting to that is different. Four months ago, I would have said facial recognition is a technology that’s being deployed in certain spaces. Airlines were starting to use it to board people onto airplanes. And if your model of privacy is that privacy is secrecy, what do you do when your face is the thing that you’re trying to not disclose? We can’t do it. So then you have to look at data use. So I think that’s an interesting thing. I’ve already lost track of the question. Tell me again.

Jim:
With the frightening things that we read and see on a daily basis, I’ll take this for a second: going out in public where facial recognition is ubiquitous, is facial recognition something that, not whether we need to or not, we’re going to have to get comfortable with in order to leave our houses? And is there anything we can do to protect our privacy, if not secrecy, as we go outside?

Maritza:
Right. That’s right. You’re bringing it back to the actual question, which is, we have to get comfortable with the fact that we cannot have complete control, that these things will be deployed, and then start asking questions about, well, who’s building it, where is it deployed, how is it used, and where can I direct my concerns, to then take actionable steps, I think. Being against it outright is increasingly difficult because you don’t control all the spaces that you go into.

Maritza:
So kind of quickly jumping from the, do I like it or not? Is it good or bad? Should it exist or not? These kinds of absolutes. You have to kind of drop those and then say, well, if we’re in a murkier space, then how do we think about that? So with facial recognition, you don’t ask the question of, should facial recognition exist as tech? You’re asking questions like, who should be able to use it and for what? When it is used, am I aware? Can I opt out of it? Can I look into the decision making? What are the nuances there? So I feel like those would be some interesting questions to get curious about.

Keith:
So Maritza, let’s go straight to who’s using it and what the permissions are and whatnot. We’ll start with usage in the United States. You brought up that airlines were beginning to use it for confirmation of passenger identity, to track who’s flying on what flights. In other countries, my understanding is they’re using it as well, even in some shopping environments, to automatically tag via RFID what you’re walking out of a store with, without ever having to go through a line. What are we seeing in the United States? Other than, obviously, the iPhone uses it for logins, which we’ve kind of tacitly just handed over permissions on, but where else are we seeing it? What’s the growth area on that?

Maritza:
To be honest, I don’t completely know the extent of it. I see various things on Twitter suggesting even with the protests that are happening in early June, that it might be that police are using facial recognition to identify who’s at the protest and go after them later. I feel like I’ve seen a couple of credible claims that that’s actually happening.

Maritza:
Definitely, as you said, some of the major airlines are using it. Jim, when we were preparing for this podcast, you sent me a note on the Ring cameras. I’m pretty sure I’ve seen some credible claims that there’s face recognition going on there from law enforcement. And, I mean, with this question, like I said earlier, I broke it down between data collection and data use. In the US at least, a lot of our protections and rulemaking are around data collection. We don’t today have great technical systems for limiting and detecting and policing, or just regulating, data use. I feel like we’re in a situation where, if we actually had the ability to see into that, there might be uses of data that we’d be quite alarmed about.

Jim:
Let me jump on that. You mentioned regulation of the data. I’ve seen reports about smart TVs; a lot of people have these smart TVs where Netflix is already built into the remote, Amazon’s already built in, whatever. Are smart TVs just TVs, or are they something we should be somewhat concerned about, or at least aware of?

Maritza:
Smart TVs, and really any smart device that you’re putting in your house, are things that you should be aware of. You should be curious about them, and you should be asking questions. Interestingly, I did some research about two years ago where we did a survey of smart TV owners, asking questions to measure: what’s your attitude toward data collection and use via your smart TV? What’s your attitude toward and understanding of what’s happening? What are your expectations? And, a little bit, trying to compare that against the reality.

Maritza:
So Consumer Reports did some excellent research looking at smart TVs and the data that is being collected per brand. I feel like people are pretty unaware of the sort of data collection that’s happening, and the data use there. And then what was interesting from our survey is that we found roughly, I want to say, 40% of people assumed that there were laws in the US that would protect them from their data being used in ways they didn’t expect. Now that’s notable. If you think that you have a law that is protecting you and basically restricting how data can be used, you kind of count that out as a thing that you’re not going to care about. You feel like you don’t need to wonder; you feel like it’s taken care of. That is not the case.

Keith:
In the same sense that once you install antivirus software on your computer, you think you’re immune. And honestly, we’re looking at COVID, and everyone’s talking about a vaccine at this point. There’s this assumption in everybody’s mind that once we have a vaccine, we’re all fine. And I think every physician you talk to says that’s not going to be the case, no matter how strong this vaccine is down the road.

Keith:
But I’ll also point to the fact that with facial recognition, we’ve had the technology for some time now. It may not have been automated, but we’ve been using it. If I look back a few years ago, and obviously we’re all in Charlottesville at this point, the August 12th Unite the Right rally: the images of the faces of people marching in that rally were sent out and crowdsourced throughout the United States to identify who the perpetrators of violence were, who were then tracked down. Their employers were notified. They were fired from jobs.

Keith:
That may not have been automated, but that was absolutely facial recognition. I’m sure that nobody thought at the time, if I march, I’m putting myself in jeopardy. I think now we are much more aware of what cameras do all over communities. And I think we are aware that if you’re involved in protests, those names are going to be recorded in some form, whether it’s automated or through crowdsourcing. I mean, is there a difference, other than the speed with which we can identify, or the tracing and tracking of that recognition?

Maritza:
I feel like at all levels it’s different. To me, at least, if I were to start to break it down, I would be looking at: who is doing what matching? What’s the data that they’re using to train it, and to what effect? So for the Charlottesville, the alt-right stuff, we generally agree: Nazis are bad, that rally shouldn’t have happened, there’s a lot of violence that shouldn’t have happened.

Maritza:
So then it felt good to watch the mob justice go after and identify the folks. And then if you look at, what if a certain company were using facial recognition to automatically detect and alert law enforcement to things that they suspect would be crimes? It’s kind of the same effect, but then across all facets you wonder, well, how does this stack up? Which one are we okay with, and why? And which one feels icky?

Keith:
Well, and Jim, you’ll probably know if this is still in use. I remember a few years ago in Charlottesville, we were dealing with police cars that had license plate readers. They were actively monitoring every single plate they passed and GPS-tagging every license plate to identify what patterns cars were taking, what routes they were taking, where they could be found, in the event that that car needed to be located six months from now during some arrest. Are we still using that in Charlottesville? I mean, I’m assuming that’s still in use throughout the United States. I haven’t noticed them on the back of our cruisers, but-

Jim:
I think they are, I’ve seen them, but I want to jump back for a second, and not for a scary headline, although my question is a scary headline question. What data are the smart TVs collecting? Is it just the hours that we’re watching, the shows we’re watching, eye-tracking movements, things like that?

Maritza:
Yeah. When we set this up, Jim, you asked me to be positive. You asked me not to feed into some of that. You asked me to focus on: what can you do? How do you get involved? I think with your smart TV, the thing that I would say, if you remember one thing, it would be: be curious. So be curious about the devices that you’re buying, and think about, just wonder, what’s possible here. And you might find yourself asking certain questions; there’s no harm in asking questions.

Maritza:
With your smart TV, the answer to what is happening depends on what the sensors are. A basic smart TV is basically a computer, running apps that show you the content that you’re watching. So at a minimum, a smart TV may be tracking which shows you watch and how often you watch them. It might be guessing how many different people are watching it. So let’s say, just to lean into stereotypes, that you have the dad of the house who’s watching sports all the time, or at certain times. You have the mom of the house who’s watching Grey’s Anatomy all night. You have the kids who are watching in the day. It is not unusual, it’s pretty common, for a smart TV to at least be collecting the data of what is being watched.

Maritza:
And then it’s a question of data use. So then, for the parties who are able to see what you’re watching, what can you learn from this information? And I’m not saying that there are companies who are using what you watch to then make behavioral guesses and kind of attempt to influence you or persuade you and use it in marketing. But if you tie that into what we know about online advertising, and the things that you see on Facebook and in online ads: if you see somebody who watches a lot of sports, how do you advertise to them? If you see somebody who watches a lot of emotional content, what can you infer about them? How would you use that to monetize? There’s always the question of monetization.

Jim:
So it might not be that Facebook is listening to you on your iPhone, but all of our TVs are watching us and everything that we do. Okay, great. This is not making me feel good about this world.

Keith:
Which instantly brings up the whole Cambridge Analytica question. Facebook may have been one piece, but add to that the smart TV, add to that the GPS of your car, add to that your phone’s movement and every app you’re touching, and it does start to paint a very complete picture.

Maritza:
One of my earliest kind of “what’s going on with the smart TV” moments: as a family, back in 2017, 10 of us were all in a cabin in Tennessee. And my father-in-law, he loves old Disney movies, like the old, old Disney movies. And he was talking about one specifically. So he’s on his phone, he’s talking about this movie, and I’m sure one or more of us Googled it. And then, like three days later, he saw on Facebook a suggestion to add that he liked this movie to his Facebook profile.

Maritza:
And everybody’s, of course, asking me: how? What just happened? And you jump to thinking, oh, the microphone on his phone is what gave it away? And I’m like, no, that’s almost certainly not it. To be clear, there’s no evidence to suggest that your microphone is being used. I mean, you can’t prove a negative, so you can’t say for sure it’s not happening, but you wonder. I’m pretty sure what happened was that because our phones were all connected to the same wifi router, something got tied together by IP address and flagged, and some system picked it up. That’s a reasonable explanation.

Jim:
Which is an extraordinarily terrifying highlight of how tied and intertwined everything is. And the only way to not be tied in-

Maritza:
And that’s so amazing.

Jim:
Yeah. The power of networks is extraordinary because it will just tie everything together from-

Maritza:
And from human reasoning, it’s like, how do you wrap your head around that?

Jim:
It’s such a massive concept that it’s just been fun. From a smartphone perspective, when you move into a new house, you’re talking about the wifi, I’m taking this thread. When you buy a new house, you might go in and you have a Nest thermostat, you have a water monitoring system that sees how much water you use. Some of these places are using solar systems now and tracking usage. I want to say again, not concerned, but aware and curious: when someone buys a smart home, with air quotes for those who can’t see us, what questions should people be asking? What level of curiosity or concern should they have when they’re going into something that’s been monitoring another family?

Maritza:
Right. Right. So I would say if you’re moving into a house and kind of inheriting a network, inheriting devices, or even if you’re making decisions about buying new devices. So I deal with both security and privacy. For most people, they’re one and the same. It’s like, I want to use cool tech. I don’t want to feel dumb for doing it. I want to have the benefits. I don’t want to be harmed. Help me feel safe.

Jim:
Real quick, what’s the difference? Because you said security and privacy, and I’m like, same thing. What’s the difference?

Maritza:
This is a hotly contested debate. In kind of like a simple sense, security is about protecting the information. So security is about, you want the right people to have access to the data when they need it. And you want to keep the wrong people from having access to data. So that’s security. I love working on privacy, because I feel like it’s endlessly interesting, depending on who you ask, you’ll get a variety of definitions. To me, increasingly privacy is about power and it’s about, you want to be sure that all these things that are being digitized are basically being used in a way that benefits you and that you’re not being … I’m still working on my definition on this one. You don’t want your data to be used against you in ways that you can’t track down. You want to understand.

Jim:
So on that, and we’ll come back to the home in a second, but I’ve seen some things that were talking about how your data rights, or your privacy, what have you, are human rights?

Maritza:
Yes.

Jim:
I know it’s a huge question for a 45-second answer, but is that a thing that we should be curious about? I love your framing, the idea that we should be curious about human rights and data rights.

Maritza:
Yes. Today I would be hard pressed to find somebody who thinks seriously about privacy who would push back on the idea that care for privacy, care for data, is a human right. That has to be true. There is so much power to be had if you pretend as though data is not a thing to be managed.

Keith:
Is there a way to manage a process that is progressing so fast that nothing can keep up with it?

Maritza:
I don’t think you can manage it, because I think it’s a really curious-

Keith:
You have your PhD in computer science, correct? And take no offense at the way I may have to phrase this, but is this something for computer scientists to be discussing, or is it-

Maritza:
I think it’s a societal conversation. It is hugely political. I actually feel like it’s a big deal that there are folks who still pretend like it’s apolitical.

Jim:
I mean everything’s political to a certain degree, but I think that it’s … God, I can’t even frame my question. It’s such a huge idea.

Keith:
They think it’s apolitical because they want no control over it? Or they want no one else to have control over it, and therefore they say it’s apolitical?

Maritza:
[crosstalk 00:23:19] maintaining the status quo and do not want to be held accountable.

Keith:
Unregulated.

Maritza:
Like with Mark Zuckerberg and Facebook refusing to take responsibility for what happens on the platform, that’s huge. Sorry, I may be alienating some of your listeners if you-

Jim:
Not at all. No, it’s fine. I think it is something that, one, is talkable, and two, take my admittedly ignorant position versus yours, but if you have a platform, you’re responsible to a certain degree for what’s out there, for not inflaming hatred and racism and all the bad things that we’re seeing in our world. Clearly that is not an apolitical concept in our society.

Maritza:
Yeah. I’d tie it back, Keith, to the question that you posed about the use of facial recognition, where in one case it’s looking at a subset of footage and identifying those people, and then what I brought in, which was the automated and consistent and continual, and perhaps permanent, use of it. I feel like the thing that changes there is scale. How big is this effect? How big is the platform? How much data is being collected? What’s the scale, the scope, of the potential consequences?

Maritza:
20 years ago, we felt like the internet was this cool, not-real-life place that existed side by side with real life. Like, you invited me to talk today because we have these devices in our homes. That is very private. That is a space that is traditionally not open to surveillance. We would think hard before inviting surveillance into our home, usually, but today you don’t. Exactly.

Keith:
And yet we put Nest cameras all over it. Absolutely.

Maritza:
We’re not being curious enough.

Jim:
A lot of it always goes back, for me, to the Jurassic Park quote, Jeff Goldblum’s line: we spent so much time thinking about whether we could, and not enough about whether we should. I think the curiosity is a responsibility for all of us, and we lack that because of where we are. Keith?

Keith:
Well, I’m actually just thinking back, Jim, and I don’t even know how many years it’s been, probably seven, since your eldest daughter went off to college; she’s been out for three years maybe, four years. So eight years ago, you and I spoke when she was going to college, and you had made the remark: remember, every picture that your friends are putting out there through whatever social media there is, and Facebook was kind of it at the time, your future boss will be able to pull that up by name and find every single thing.

Keith:
The fact that we’re now talking about how that is absolutely a reality, that we can tie together every platform, every public post. That’s not a long time to have moved from counseling kids on ways they might think about use, to now actually, absolutely being in that environment.

Jim:
Yeah. Now I’m all sorts of scared. I want to shut down the computer and start running my house with a hamster wheel. Thanks.

Maritza:
I know.

Keith:
Maritza, tying this into where the real estate market is: a few years ago, people began having cameras in their houses, they began having listening devices. We never thought of baby monitors that didn’t record as listening devices, but they certainly always have been. Anything that allows a party to listen to a conversation they’re not part of is considered wiretapping from a federal standpoint. So there’s the presence of Nest cameras, Arlo cameras; the Ring doorbell is a little different because it’s outside the home, where you don’t have an expectation of privacy, or at least those are the arguments against that one being problematic.

Keith:
But within the real estate environment, we’re required to disclose, at least in Virginia, and through Nest we require our agents to disclose, the presence of any listening device. These are now everyday technologies. I remember the first person I knew who said they wanted to put nanny cams into their house to be able to see, while they were at work, how their children were being treated. This is everyday tech that is now absolutely recording and impacting our every day. And the question of, is this scary? Yes, it’s scary, but the scariest part of this in my mind is that we do not pay attention to what’s happening.

Keith:
We suddenly got terrified with Cambridge Analytica, but what has changed based on Cambridge Analytica? Nothing. Except that there are now probably even more companies purchasing and analyzing big data. There were articles that came out about Target identifying 16-year-olds who were pregnant before they told their families, and that was based off of unscented hand cream. That was a single data point. In real estate, we’re identifying which of our clients are most likely to move based on their buying practices, their use of Zillow, or whatever else. I guess the big question is: yes, it’s scary, but how do we increase people’s desire to have more information? I mean, isn’t that the scary part, that we don’t care?

Maritza:
Okay. So there are so many thoughts. Yes.

Keith:
There are a million thoughts in that.

Maritza:
I love it.

Keith:
Just love the way it rolls with me.

Maritza:
I don’t think it’s that people don’t care. I think that people don’t know that they need to care. We all really need to care about this right now. We all need to care about the devices. We need to care about the companies behind them. We need to care about the data, especially in the US. There are countries where you actually have a right to privacy. In Germany, you have a right to privacy; it is a fundamental human right. It is not so in the US.

Maritza:
And then I love your question, because the way you teed it up reminded me of a thing that’s easy to forget. In the US right now, your privacy protections are based on specific, targeted laws for specific industries. So your health data under HIPAA, students’ data under FERPA, and your video rental logs, because some politician got busted renting porn, and now video rental logs are a thing to be protected for always. Yeah, it’s super goofy. It is very targeted, very specific. We also have wiretapping laws, and those are being used to inform how we think about digital communications and digital data. It is not enough.

Maritza:
So you have scattershot federal laws, scattershot state-level laws, all these different things. One thing that I think has been super interesting over the past couple of years is that Consumer Reports has started to include security, privacy, and data collection as items that they look for when they do product reviews. That’s incredible. I actually watched an academic talk about this just a month ago. They have what they call the Digital Lab.

Maritza:
So they actually brought in like 20 of the leading smart TVs, hooked them up to a router, and sniffed the traffic to see what data is being taken off those TVs. And that’s how you’re getting answers right now. That’s crazy to me; it’s crazy that that’s where we’re at in 2020, that you have to count on a lab to bring out all the bells and whistles to detect and try to piece together what’s happening. To me, that’s just how unregulated it is.

Keith:
So you mentioned that Consumer Reports is doing that for some products. There are groups out there, like B Lab, that are doing social justice monitoring of companies and scoring their social impact. Is there anyone publicly ranking companies on their privacy protections for customers? Is there any group out there doing this?

Maritza:
… there are a couple of scorecards that they look at. I should be embarrassed not to just roll this off the tongue. I think Electronic-

Keith:
Frontier Foundation, is it?

Maritza:
Thank you. Yes, yes. Electronic Frontier Foundation. They do scorecards looking mostly at security, looking at different encryption practices and what you can count on there. They’re, I think, a leading voice in this space. Consumer Reports is increasingly … I’m trying to think who else pops up as a group to contrast. Journalists. Journalists do a lot of great work in the privacy space. Kashmir Hill, if you want to read great stories about companies doing interesting things with data, she’s like a top investigative journalist in the States. Yeah, I’m sorry, I don’t think I actually answered your question. There’s no good answer. Yeah.

Keith:
I guess the answer is there is no company out there doing public awareness campaigns of ranking. Frankly, the way Tesla cars might maintain driving [crosstalk 00:33:29]

Jim:
It’s not just Tesla, it’s all [crosstalk 00:33:31] basically giant black box computers that have all of our stuff. Before we start going down the path of me getting a cabin in the woods, losing my smartphone, and just having a rotary phone, maybe, I’m going to ask Maritza a closing question, because I think the three of us could talk about this for hours, and that wouldn’t be enjoyable for anyone else at the end of the day. So this podcast is Sweat the Details. We’ve talked about a lot of stuff today. If you get to pick one detail that you wake up every day and sweat, what is that one thing you focus on on a daily basis?

Maritza:
I think Keith might’ve hit it earlier. One thing I do sweat is this idea that people don’t care about privacy because they still use Facebook, because they still interact in the world, because they participate in society. Like, you’re here, let’s say you’re talking over Zoom, so clearly you don’t care about privacy. But no, it’s that we don’t have good outlets for turning our care into action. So I’m going to spin this to say: sweat the details. Be curious. Dig into the details. Ask questions. Realize that not enough people are sweating the details on data protection, and figure it out.

Jim:
Wow! Maritza Johnson, thank you so very, very much for enlightening and terrifying and educating us. I, for one, will be more curious going about my daily business, inside and outside the house. So thank you so much for spending the time with us.

Keith:
I have to say, we talk to people every week on this show, and there are many people that I am fascinated to speak with, but few whose topic I think I could spend days and weeks and months continuing to learn about, because, as Jim said, it’s terrifying, but this is awesome. I envy that you get to spend your life studying this stuff and talking about it with other people, because these are fun, brain-trust-type conversations that I’d love to have with everybody else in your field.

Maritza:
Thank you. Thank you for having me.

Keith:
Thank you for your time.

Jim:
This is awesome. Thanks Maritza.

Keith:
Thanks for listening everyone.

Jim:
We hope you enjoyed the conversation. If you have any questions or comments, please send us an email at sweatthedetails@nestrealty.com. Until next time. This is Jim Duncan with Nest Realty and Sweat the Details. Thanks for listening.
