0 | Data Sovereignty Is Complex: Ask These Five Questions Before Moving Your Data to the Cloud | Guest Post by Rusty Chapin, Manager of DevOps, BitTitan
It’s not hard to see why more and more organizations are moving to the cloud. Given the shift to remote work during the pandemic, the benefits are clear — robust collaboration and engagement among remote employees, as well as the ease of sharing and securing data. But as organizations look to move workloads to the cloud, there is one often-overlooked issue they should be aware of: data sovereignty.
Put simply, data sovereignty is the idea that an organization’s data is subject to laws and other governance structures of the region or nation where it’s collected. When migrating data to the cloud, data sovereignty plays an important role in how and where the data is moved. Knowing what questions to ask potential vendors prior to migration is the key to data sovereignty compliance.
Before we jump into these questions, we should understand what makes data sovereignty so important. First, it’s a security issue with serious ramifications for noncompliance. Medical, financial, educational or governmental data have specific requirements about how and where the data is stored. In some instances, data can’t leave its country of origin. For instance, government agencies’ documents and emails cannot be processed on another country’s servers.
Noncompliance with data sovereignty initiatives can mean hefty fines, legal action against your organization, or even jail time in the case of federal laws like HIPAA and FERPA. It’s vital to know exactly where your data goes and how it’s secured during migration. Asking these five questions before choosing a vendor can help ensure the move complies with growing data sovereignty initiatives.
What happens to my data while it’s in transit?
Ideally, there are no intermediary stops for your data during migration. If the potential vendor tells you that your data will temporarily be stored in a private data center on its way to its destination, that should raise red flags. Such data centers are likely not as sound and secure as a public cloud offering.
These data centers might also be outside the region or country prescribed by data sovereignty regulations, which could result in noncompliance issues. It’s important that your vendor understand the certifications and compliance requirements that apply to your data. Clearly communicating the various data sovereignty requirements to potential vendors is an important first step.
Does the vendor store my data during and after migration?
Ideally, no: the vendor shouldn’t store your data in any form. This includes writing it to a disk or keeping it in a database. Instead, they should be handling all your data in application memory. During the migration, any point where your data stops is a risk. It’s better to avoid those risks if possible.
How should I encrypt my data during migration?
Sometimes there are bottlenecks during migrations. For example, the destination may be a little slower than the source, creating an overflow of data that temporarily needs to be written to disk. In this situation, it’s essential that the data be protected with strong encryption and then removed once the migration has been completed.
Data at rest (as opposed to data in transit) creates a window of opportunity for theft and other compliance issues. That’s why this window needs to be as short as possible. If your data comes to rest during migration, it needs to be cleared immediately after it reaches its destination, shortening the window of resting time.
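To illustrate the “encrypt, then clear” pattern: the following is a minimal sketch, not any particular vendor’s implementation, and the function names and file path are hypothetical. The idea is that plaintext never rests on disk, and the encrypted spill is deleted the moment the migration catches up.

```python
import os
from cryptography.fernet import Fernet  # symmetric, authenticated encryption

def spill_to_disk(overflow: bytes, path: str) -> bytes:
    """Encrypt overflow data before it ever touches disk; return the key."""
    key = Fernet.generate_key()          # per-spill key, held only in memory
    with open(path, "wb") as f:
        f.write(Fernet(key).encrypt(overflow))
    return key

def drain_and_clear(path: str, key: bytes) -> bytes:
    """Read the spill back, then remove it immediately to shorten the at-rest window."""
    with open(path, "rb") as f:
        data = Fernet(key).decrypt(f.read())
    os.remove(path)                      # clear the resting copy as soon as it's drained
    return data
```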
How does the vendor handle credential security?
Migration credentials are the keys to the kingdom. Whatever vendor has access to those credentials also has privileged access to the source and destination of a data migration. So the utmost care must be taken to secure these credentials.
The best practice is for the vendor to encrypt the credentials whenever storing them. Be sure to ask potential vendors about their own protocols. On the client side, you should always create accounts separate from normal administrator accounts for the vendor. That way, when the migration is complete, you can immediately disable the vendor accounts.
What happens to my data after migration?
Even after the migration is complete, there are still windows of opportunity for data insecurity. Knowing how a vendor handles your data post-migration is important. First, they should clear all data once the migration is successful. Deleting it isn’t good enough.
Any copies of your data should be destroyed, as should the operating environment in which that data was processed. Anything that leaves a memory footprint provides the opportunity for someone to bring your data back to life.
The cloud is a powerful tool, especially with the rise of remote working. But it’s important to do your due diligence before migrating to the cloud. Look at the vendor’s reputation in the industry. Ask to audit the migration logs to see exactly what was done, who was contacted during migration and what calls were made. That way you can see if anything out of the ordinary pops up. With issues of data sovereignty, it’s better to be proactive than reactive.
Finally, look for vendors that are established and tested, not those who slid into the market after seeing an opportunity. When it comes to data sovereignty, you want to make sure that your organization and its data are in the hands of a vendor you can trust.
About the author
Rusty Chapin is the manager of DevOps at BitTitan, where he leads a team of engineers to deliver first-class cloud solutions to MSPs and IT professionals. His areas of expertise include database development, SQL server clustering, large-scale SQL server deployment, datacenter operations, IT management and executive mentoring, monitoring system design, and process analysis. | https://digitizingpolaris.com/data-sovereignty-is-complex-ask-these-five-questions-before-moving-your-data-to-the-cloud-b28dcc18d3fa | ['Virginia Backaitis'] | 2020-12-18 05:37:41.764000+00:00 | ['Cloud Migration', 'Information Technology', 'Cloud Computing', 'Digital Transformation', 'Information Management'] |
1 | Esports vs “Real” Sports |
I’ve been an esports player and fan since Starcraft LAN parties in the early 2000s, and I couldn’t be more excited about where esports are going. Between Blizzard’s Overwatch League announcement and the acquisition of Team Liquid by Monumental Sports, it’s clear that things are changing. Players can now expect sponsorships and salaries. Sports and esports practices are starting to converge.
But some huge, fundamental differences exist between esports and sports that cannot be ignored by anyone more than recreationally interested in the space. So, here’s my list of 10 things that differentiate esports from “real” sports for those of you that still prefer soccer to Starcraft.
1. The core audience is niche.
Traditional sports are mainstream. Esports were created for and by a subculture. It’s not the same culture that created football, baseball, and hockey. The esports audience (for now) somewhat mirrors the demographics and characteristics of gamers and game developers.
But, you might be surprised. Compared to the general population and to all gamers, esports fans are:
- More likely to be a millennial
- Wealthier, and more likely to be employed
- More likely to be male
But wait, wasn’t I supposed to say esports fans are more introverted, weirder, or geekier?
I’ll let this very normal picture from an Oakland Raiders Game speak for itself.
2. There are no referees.
In traditional sports, you have to remember the rules and play by them: if you touch the soccer ball with your hands, the other team gets a penalty kick.
In esports, the rules are embedded in gameplay. You physically cannot touch the soccer ball with your hands. The game won’t let you. Since it’s impossible to break the rules, you don’t need a referee watching for in-game violations.
3. The rules change.
Every few months, game studios update the rules. Sometimes this is to make things fairer, sometimes just to keep the game interesting.
If hockey rules worked like esports, it would look something like this:
— Season 1: Ice now has less friction.
— Season 2: All goalies move 20% slower.
— Season 3: Goalies can now use two sticks.
This keeps the game new. It also makes esports more difficult to understand than traditional sports.
4. The games are more complex.
Because rules are programmed directly into the game and change regularly, it can take years to fully understand a Multiplayer Online Battle Arena (MOBA) like League of Legends or a Real Time Strategy Game (RTS) like Starcraft II.
This will likely change over time. As more money gets involved and esports cater to a broader audience, games are already getting better at visual cues and statistics to help casual fans understand what’s happening.
5. The games change.
New games come out, old games die. This is great for the core esport audience since it keeps things fresh. But it also makes it hard to accomplish structured objectives like building arenas and investing in branding, merchandise, and leagues.
Popular esports in 2012:
Dota 1, Counter-Strike, Starcraft II, League of Legends
Popular esports in 2016:
Counter-Strike: GO, Overwatch, Hearthstone, League of Legends, Dota 2
6. There are many, many, many leagues.
Anybody can make a league, which means there are a lot of them. Here’s a current example on DotaBuff of just how many leagues are out there.
In sports, there is usually one big league, maybe a few minor leagues, and a few large tournaments (football has the NFL and the Super Bowl). But in esports, there can be several “official” leagues, an unlimited number of tournaments and events, and the same teams can play in all of them.
The Riot Games World Championship at Sangam. [Riot Games]
Anyone with enough money and motivation could start a league, put up a large cash prize, and (with effective management and promotion) have little shortage of interested participants. If traditional sports worked like this, anybody with the resources to do so could essentially host their own Super Bowl.
7. It’s global.
Esports live on the internet, so you can watch or compete from anywhere. That gives nearly every tournament an amazing, Olympic-like tension. (Go USA!)
You could even argue that esports are fundamentally intergalactic.
Live long and get rekt, noob. [Life at UofT]
8. Viewership is less dedicated.
While traditional sports are broadcast through television, esports usually reach their viewers through live streaming.
It’s more difficult to hold a dedicated audience (and reach them with ads) through streaming than TV. When there’s so much content available on the internet, it’s easier to switch your attention elsewhere.
As an example, this was my weekend:
Television: BlizzCon Hearthstone Championship
iPad: ESL One Frankfurt 2016 Qualifiers
Computer: DoTA Personalities streaming on Twitch
Do you think I paid attention to any advertisements at all? Nope. I just turned my attention to one of the other three screens I was watching. And that brings me to…
9. Advertising and sponsorship is harder.
Since internet fame is not gated by brands or media, authenticity, intimacy, and content matter more than institutional support. Star players won’t necessarily give up trolling people on Twitch to earn endorsements — if they do, they can lose their fanbases.
Moreover, esports fame is ephemeral. Players can be good one year and completely irrelevant the next due to gameplay changes. Coaches leave to attend college. Team names and ownership shift dramatically and often.
All of this makes traditional sponsorship and team management difficult.
10. Esports teams are structured differently than sports teams.
The players are transient, the games change, the rules evolve, and there are a lot of leagues. Viewership and income sources are fragmented and decentralized. As a result, big esports teams look and act more like brand franchises than monolithic businesses.
Most institutional team brands like Team Liquid and Cloud9 have multiple teams for different esports games, including Dota 2, League of Legends, Starcraft II, and CS:GO.
This sounds confusing, but everything on the ground — what sports are hot, which teams are good, which teams exist — can change in the space of a month. This construct allows the brand to stay above the volatility in the trenches: Team Fnatic endures, even as its individual esports teams come and go. | https://medium.com/@cmfoil/esports-vs-real-sports-b8e6db1ff793 | ['Cheryl Foil'] | 2017-05-31 15:39:00.012000+00:00 | ['Esports', 'Gaming', 'League of Legends', 'Overwatch', 'Technology'] |
2 | Should Your Company Shift to a Hybrid Workforce? Look to the Cloud for the Answer. |
Think back five or ten years to when your company first started considering the idea of moving to the cloud. Likely, a few people wanted to move entirely to the cloud, a few were entertaining the idea of a hybrid model, and even more thought the idea of a complete migration was ridiculous. Where are all of those people now?
Today, it would likely be unfathomable to imagine your company without the use of the cloud in your day-to-day work, especially now that the modern workforce is in a transitional period.
Now think about how you and your colleagues view a remote work environment. Likely, you have a similar dynamic. Some of you are ready to work from home full time, others think a full return to the office is the only way, and still others are pushing for a hybrid model.
Onsite = On-Prem | Remote = Cloud
Years ago, the companies that were early adopters implemented a cloud-forward strategy and were considered the risk-takers. They were the ones everyone expected to fail, and now they’re considered the revolutionaries. Back then, the only way to access a good internet connection was to log onto the big, bulky desktop at your workstation in the office (I’ll bet you can still hear the dial-up tones if you try). No one would have ever imagined you’d be able to work in that same spot without having eyes on the physical place where your data is stored securely — remember the big server room, and paying rent for a big room just to house your data? It’s now been rendered all but obsolete.
We are on the verge of the same type of transition when it comes to a remote workforce. The onset of the COVID-19 pandemic radically changed the way that people were able to work. There was no choice but to send employees home to work, and while everyone originally thought it a temporary fix, over a year later, it seems it may not be so temporary after all.
Hybrid Workforce = Hybrid Cloud. It’s a good option, but it’s better used as a transition.
There were absolutely more conveniences to having some of your data in the cloud as opposed to none of it. As you started using the cloud, you likely had some information there and the rest on-premises, then slowly transitioned until you were totally virtual.
You’re likely having a similar experience now with the remote workforce. Maybe you have a rotation going of who comes into the office every two or three days, or everyone meets once a month for a certain meeting. There’s probably talk of when it will be safe to return to the office full time, and some employees would rather be back as soon as possible while others would prefer to stay virtual.
This is where the cloud vs. on-prem metaphor returns. Making a move to an entirely remote workforce is the same choice that your predecessors had to make about moving entirely to the cloud. A daunting concept at first, but looking back, it’s hard to believe anyone was ever against it. Five or ten years from now, it will be hard to believe anyone ever stood against a remote workforce.
There are many benefits to a remote workforce that haven’t even been realized yet, just as it was with moving to the cloud.
For example, when people made the choice to move to the cloud a decade ago, how could they have known it would be vital to allowing employees to work from home during a deadly global pandemic? Talk about an unforeseen benefit!
More immediately, think about the money you’ll save as a company by going fully remote. You won’t have to pay rent, utilities, insurance, or anything else that you would normally pay on a physical space. You’ll be able to hire #TopTechTalent from anywhere you want to — there’s an entire untapped market of technology professionals you can choose from when building out a team.
Your reputation as a Best Place to Work is also likely to skyrocket. Remote work capabilities are a huge benefit for companies, and it will definitely give you a leg up over your competition. You don’t want to miss this chance to be at the front of an evolving workforce.
Steve Jobs once said, “Innovation distinguishes between a leader and a follower.” Which one are you going to be remembered as? | https://medium.com/@syftersteve/should-your-company-shift-to-a-hybrid-workforce-look-to-the-cloud-for-the-answer-ee1b00f7bc77 | ['Steven Perlman'] | 2021-04-27 15:54:11.739000+00:00 | ['Remote', 'Technology', 'Best Practices', 'Cloud', 'Hybrid Cloud'] |
3 | Efficient energy storage and future PVS by Solar DAO | A solar power station (PVS) is a specially equipped system, built around solar modules, that converts sunlight into electricity. At the moment, solar energy is one of the fastest-growing types of renewable energy.
Solar DAO is a tokenized fund designed to let everyone easily participate in PV solar plant construction across the globe. SDAO tokens are listed on YoBit, Bitafex and Idex.
Solar module + inverter = electricity
The main problem with any renewable energy source is irregular electricity generation. Unlike conventional non-renewable sources, the peaks of generation do not match the peaks of users’ consumption. A wind station works only when there is wind, and a PVS reaches maximum power output only in daytime, sunny weather.
Using power storage can solve this problem.
It is important to note that efficient storage means not only storage devices (batteries), but also the overall technical solution, including controllers and software.
The main types of storage used are:
- Acid batteries. Bulky devices with a short lifespan and significant losses in conversion and charging.
- Lithium batteries. A fairly effective option, but still very expensive and dangerous due to heating.
- Mechanical storage. Works on the principle of raising and lowering a load. Rarely used because of the huge energy losses.
- Hydrokinetic storage. Electricity drives pumps that move water, and the accumulated water potential is later used to run turbines. The main disadvantage is the huge size of the installations.
- Hydrogen storage. Hydrogen is produced by electrolysis and can later be converted back into electricity and heat. High cost, explosiveness and fire hazard have kept this method from spreading.
Hybrid solutions are also used, which also have their pros and cons.
The main problems with industrial-scale storage in practice are high cost and low capacity — there is practically no storage technology that is economically profitable yet. Existing batteries are too expensive and have low efficiency, which means the batteries can cost more than the PVS itself.
Last year, SDG&E unveiled the world’s largest lithium-ion battery energy storage facility. The 30 MW installation is capable of storing up to 120 MWh of energy.
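For context (a back-of-the-envelope reading, not a figure from the article): dividing energy capacity by power rating gives the discharge duration, here 120 MWh ÷ 30 MW = 4 hours of storage at full output.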
The $1 billion Riverland solar farm is expected to be built in 2018. Riverland Solar Storage’s 330 MW of solar generation and 100 MW battery storage system will be the biggest in Australia.
And what about Tesla?
For reference: Tesla currently provides two types of power storage systems — Powerwall and Powerpack. The first is meant for residential and small office spaces; the second, for enterprises, at a cost of $250 per kW.
Solar DAO and the best solution
The Solar DAO team has spent many years researching and choosing the most profitable engineering solutions. We also approach the issue of power storage very carefully.
Together with our partners, we plan to construct a project with a capacity of about 40 MW in Turkey.
Our partners will help build the PVS by selecting the appropriate kinetic storage and providing support:
- Nord Systems. Providing development potential, specialized software and information display systems.
- Atlant Energy. Innovative equipment, kinetic energy storage.
- Natpromlizing. Financing projects, supporting modernization.
- Sberenergodevelopment. Full support of the project, selection of technical solutions, financing, assistance in implementation.
After the first successful project in Turkey, we plan to run several pilot projects in India and expand this experience to Israel, Germany, Malta and Cyprus.
4 | How do you hold a climate-friendly conference? Log in. |
Air travel is a huge source of carbon emissions — and part of the job in academia. Does it have to be?
Illustration by Francesco Zorzi
By Schuyler Velasco
Stephen Allen hasn’t taken a flight for work since 2006.
The management lecturer at the University of Sheffield studies ways that large organizations can operate more sustainably. And he decided early in his career that he needed to embody what his research recommends.
“I’ve taken a fairly draconian approach,” Allen says. Last year, he refused to fly from his home in the United Kingdom to a conference in Croatia. “Once you start looking at the carbon footprint calculators, you think, my God, me cycling and walking to work or whatever, it’s wiped out the moment I get on an airplane.”
Frequent flying, like eating meat and driving a gas-guzzling SUV, is a choice facing ramped-up scrutiny in an era of climate change. In January, the annual World Economic Forum drew some side-eye when a record number of private jets landed in Davos for the event — even as organizers touted global warming as a key topic of conversation. Climate activist Greta Thunberg refuses to fly; her activism has helped kick-start a “flight-shaming” movement worldwide and especially in her home country, Sweden.
Many professionals are grappling with how to adapt their work habits to a warming planet, and the stakes feel even higher in higher education. Flying to research sites and academic conferences has long been an essential part of a professor’s job. But many climate and sustainability researchers, who spend their careers thinking about ways to lessen humanity’s environmental impact, feel uncomfortable with the trappings of a typical conference: a round-trip flight, a days-long stay in an energy-guzzling chain hotel, wasting paper with all those programs and name badges.
Still, the pressure can be high to take part in industry confabs, says Jennie Stephens, a professor of sustainability science and policy at Northeastern University. “Particularly within some disciplines,” Stephens says, there’s a feeling that “you really need to show up.”
Or do you? Stephens and Allen are among the academics trying to give the traditional conference model a climate-friendly makeover — and convince their colleagues to find other ways to gather the world’s experts in a given field.
“There’s not getting to know people over a coffee or a beer, but there are other upsides.” — Stephen Allen, a management lecturer at the University of Sheffield
The Tyndall Centre, a collaboration of climate change researchers from the UK and China, has created a decision tree to help its members make more climate-efficient travel choices. A case study of air travel at the University of British Columbia recommends a host of adjustments, from encouraging employees to travel economy-class to incorporating an emissions tracker into the university’s financial management system.
And this past November, when Allen and colleagues at the University of Sheffield held a symposium on reducing academic travel, they did it virtually, tapping speakers on three continents and hosting 100 participants from 19 countries. In June, Northeastern is teaming up with the KTH Royal Institute of Technology in Sweden to hold a conference on sustainable consumption that has two hubs — one in Boston, one in Stockholm. The goal is to allow both European and North American participants to attend, with little or no flying involved.
“Culturally, a lot of what we think we need [is] not very sustainable,” says Stephens, a co-chair of the Northeastern event. “So we are trying to be innovative and the travel component is a part of that.”
Air travel makes up roughly 2 percent of global carbon emissions — a small but rapidly growing share — and flying is one of the most consequential choices a single person can make about their carbon footprint. But most air travel is done by a small group of frequent flyers, many of whom are traveling for business. In the UK, 15 percent of the population took 70 percent of the nation’s flights in 2013, according to a government survey.
Academia’s contribution to that is hard to quantify, but some figures offer clues. The University of British Columbia estimates that the carbon impact of university-related air travel is equivalent to about two-thirds of the annual impact from operating its main campus. Allen says that around 20 percent of a typical research-intensive university’s carbon footprint comes from flying.
“Some people’s academic identities are totally bound up in traveling,” Allen says. “That’s how they understand their jobs. That’s how they understand being successful.”
Still, when Northeastern began planning to host this summer’s Sustainable Consumption Research and Action Initiative (SCORAI) conference, the transatlantic flights it would’ve required of many Europe-based researchers seemed discordant. After all, the theme of this year’s gathering is Sustainable Consumption & Social Justice in an Urbanizing World, and it will include round-table discussions on “Lifestyles, morality, and the climate crisis” and “Flight-free vacation practices,” to name a few.
“We reached out a hand to the Boston team and said, ‘We think it would be nice if we could gather in Northern Europe and skip flying,’” says Daniel Vare, a researcher at KTH and the project leader for the Stockholm hub, which will host about 100 people.
The two teams created a compromise between a classic conference and a virtual event. Keynote speakers will be split between Boston and Stockholm, and talks will be live-streamed. The schedule will run during the workday in Boston and into the evening in Stockholm, to bridge the six-hour time difference. Each site will hold parallel breakout sessions. Organizers are keen to avoid a situation where one hub or another feels like the “main” location. Networking events, which Stephens still believes are a crucial reason to hold conferences, will take place at both locations. “We hope that this could be the recipe for a more in-between version, where you get the social interaction, but still lower emissions and flight miles,” Vare says.
SCORAI’s menus will be plant-based and locally-sourced. The 150 or so Boston participants will have the option to stay in campus dorms (which generally operate more efficiently than hotels) and will be encouraged to take public transit. The agenda in Stockholm includes a train trip to ReTuna, a Swedish mall where everything on sale is recycled.
Putting on more sustainable events like SCORAI and encouraging remote interaction has plenty of practical upside beyond the climate, Stephens points out. “It’s cheaper for the organization, and it’s more time-efficient not to have to travel as much,” she says.
Plus, it can make for easier logistics. The virtual nature of the University of Sheffield’s November symposium allowed Allen and his team to book speakers who might not have made it otherwise. One gave a keynote from Sweden, then spoke at another virtual conference based in Spain the same day. Participants took advantage of other opportunities to connect, chatting on an online platform and in virtual breakout sessions.
“There’s not getting to know people over a coffee or a beer, but there are other upsides,” Allen says.
Still, the traditional conference model has endured for a reason. At the Sheffield symposium, one presentation raised the possibility that anti-flying measures could disproportionately impact early-career academics, who might feel pressure to turn down can’t-miss career opportunities. Even among climate researchers, there isn’t total agreement on ramping down air travel. Individual flights are still a drop in the ocean of global carbon emissions, and some see getting the message out and fundraising as more crucial than giving up flying.
“It’s quite a controversial topic,” Allen says. “It’s still a marginal community of people keenly trying to even think about these questions.”
But Vare, in Stockholm, thinks that even if conferences remain mostly the same, new models for remote events could force people to reconsider their regular work travel. He says he’s already seen smaller meetings and research collaborations become virtual more often in just the past few years. He hopes a climate-friendly academic conference could offer inspiration for conference organizers in other industries, leading to more semi-virtual events.
“We’re modeling for our students, for our partners, for other nonacademic community members,” Stephens says. “So we should be leaders.” | https://medium.com/swlh/how-do-you-hold-a-climate-friendly-conference-log-in-d36576bc0f30 | ['Experience Magazine'] | 2020-03-01 11:03:33.972000+00:00 | ['Technology', 'Climate Action', 'Higher Education', 'Telecommunication', 'Academia'] |
5 | Guide to Material Motion in After Effects | I’ve already shared why Motion Design Doesn’t Have to be Hard, but I wanted to make it even easier for designers to use the Material motion principles I know and love. After Effects is the primary tool our team uses to create motion examples for the Material guidelines. Having used it to animate my fair share of UIs, I wanted to share my workflow tips and…
My After Effects sticker sheet
Download this basic sticker sheet to see a project completed using my streamlined workflow (outlined below). It contains a collection of Material components, baseline UIs, and navigation transitions.
Download it here 👈
Available under Apache 2.0. By downloading this file, you agree to the Google Terms of Service. The Google Privacy Policy describes how data is handled in this service.
Importing assets into AE
First things first, we need assets to animate. Most of the visual designers on our team use Sketch, which by default doesn’t interface with AE. Thankfully Adam Plouff has created this plugin that adds this functionality. I used it to import our library of baseline Material components from Sketch into AE. These assets are found in the sticker sheet’s Components folder.
Creating UIs
With this library of baseline components, new UIs can quickly be assembled by dragging them into a new AE comp. | https://medium.com/google-design/guide-to-material-motion-in-after-effects-9316ff0c0da4 | ['Jonas Naimark'] | 2019-05-22 14:32:47.463000+00:00 | ['Animation', 'Technology', 'Visual Design', 'Material Design', 'Design'] |
6 | Foursquare Predicts Chipotle’s Q1 Sales Down Nearly 30%; Foot Traffic Reveals the Start of a Mixed Recovery | When Chipotle came on the scene, the chain earned lots of fans for its approach to “food with integrity,” including antibiotic-free meats, GMO-free ingredients, and fresh local produce. However, the last six months have been a tumultuous ride. Since the first E. coli reports emerged in October 2015, reports popped up across the country raising skepticism about its products and processes, and Chipotle has been racing to squash the issues, institute better training and manage its reputation. The fast casual Mexican-themed chain is still dealing with the repercussions, and an even more recent norovirus outbreak in March at two stores.
In February, the CDC gave the chain a clean bill of health. To take a deeper look at how the downturn and recovery has gone, we analyzed the foot traffic patterns at the more than 1,900 Chipotle US locations and compared them to the previous year. At Foursquare, we have a trove of anonymous and aggregate data on where people go, based on the 50 million people who use our apps (Foursquare and Swarm) and websites monthly. Many users passively share their background location with us, which our technology can match up with our location database of over 85 million places, giving us clear insight into natural foot traffic patterns. (Here’s a video that shows how Foursquare maps a large Chipotle location in downtown Portland, Oregon.)
A Look Back
Foot traffic to Chipotle started to follow the same directionally downward seasonal winter traffic trend in 2015 as in 2014. But as time went on, it became clear that 2015 was no ordinary winter for Chipotle; traffic was down in a more significant way.
The chart below shows the share of visits to Chipotle restaurants in comparison to visits to ALL restaurants in the United States. In the 2015–2016 winter, visits to Chipotle restaurants declined more significantly than in 2014–2015.
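To make the metric concrete, here is a minimal pandas sketch of how such a visit-share series could be computed. The table layout and column names (`week`, `chain`, `visits`) are assumptions for illustration, not Foursquare’s actual schema or pipeline:

```python
import pandas as pd

def visit_share(visits: pd.DataFrame, chain: str = "Chipotle") -> pd.Series:
    """Weekly share of one chain's visits relative to visits to ALL restaurants."""
    total = visits.groupby("week")["visits"].sum()        # all restaurant visits per week
    chain_only = (visits[visits["chain"] == chain]
                  .groupby("week")["visits"].sum())       # one chain's visits per week
    return (chain_only / total).rename(f"{chain}_visit_share")
```

Normalizing by visits to all restaurants is what lets the series control for seasonality: an ordinary winter dip affects every chain, so it largely cancels out of the share.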
Visit share began to recover in February 2016, marked by the CDC’s conclusion of its E. Coli investigation and Chipotle’s ‘raincheck’ promotion launch, ostensibly for customers who were unable to satisfy their burrito cravings during the company’s system-wide closure on February 8. Foot traffic took another dip, albeit much smaller, following the more minor norovirus outbreak in Boston in two locations in early March.
Sales Projections
Chipotle has publicly reported its weekly sales for the first 10 weeks of Q1, giving us ample data to build statistical models to project sales for the rest of the quarter. Taking into account reported sales, redeemed coupons and other factors, along with Foursquare foot traffic data, we estimate that Chipotle ended Q1 2016 with same store sales down roughly 30% year-over-year (which we expect to be confirmed by Chipotle when it reports earnings on April 26). Foot traffic estimates, however, tell a brighter story. Foursquare data shows that year-over-year, Q1 same store traffic declined only about 23%. The gap between sales and foot traffic is likely a result of all the free burrito coupons that were redeemed, which lured in people, though not revenue.
We believe the 23% decline in same store foot traffic is the more meaningful number that shareholders should focus on, rather than the 30% decline in sales. It shows that Chipotle is building trust back with customers, which is more important to its success long-term. Although it sacrifices revenue this quarter by giving product away, it is proving to be a winning strategy for getting people comfortable with coming back. The trick is in making sure that these customers come back again and spend money in the future.
“The problem is that Chipotle’s customers are already so darn loyal.”
— Chipotle CFO, Jack Hartung (during a July 21, 2015 earnings call to analysts)
Chipotle Needs to Focus on Loyalty
We looked at how frequently customers went to Chipotle over the past year and found some interesting insights. Last summer, just 20% of Chipotle customers made up about 50% of foot traffic visits. Because this cohort of loyal customers reliably returned to Chipotle month after month, they contributed to an outsized percentage of foot traffic, and likely sales. Interestingly, it’s this group of faithful customers that have changed their Chipotle eating habits most dramatically: these once-reliable visitors were actually 50% more likely to STAY AWAY in the fall during the outbreak, and they have been even harder to lure back in. While those who infrequently visited Chipotle last summer have returned to Chipotle at similar rates as before, the formerly loyal customers have been 25% less likely to return. The loss of these important customers is what has really hurt Chipotle, since losing 2–3 loyal customers is the equivalent of losing about 10 other customers.
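For illustration, here is a quick sketch of the concentration check behind a claim like “20% of customers made up about 50% of visits.” The input shape (a visit count per anonymous user) is an assumption, not Foursquare’s real data model:

```python
import pandas as pd

def top_cohort_share(per_user_visits: pd.Series, top_frac: float = 0.20) -> float:
    """Fraction of all visits contributed by the most frequent `top_frac` of visitors."""
    ranked = per_user_visits.sort_values(ascending=False)
    cohort_size = max(1, int(len(ranked) * top_frac))   # e.g., the top 20% of visitors
    return ranked.iloc[:cohort_size].sum() / ranked.sum()
```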
To demonstrate that this is an unusual loss of loyalty, versus natural attrition, we compared this pattern with a cohort of frequent Panera goers. The chart below illustrates that while both chains experienced a similar seasonal dip, Chipotle has lost much more traffic from its loyalists than Panera has.
Chipotle has famously dismissed the idea of having a loyalty program, stating that it didn’t believe that loyalty programs help turn infrequent goers into loyal visitors. According to Chipotle CFO Jack Hartung, “The problem is that Chipotle’s customers are already so darn loyal.” Looks like it’s time to reconsider those famous last words.
So, where are they headed instead? Foursquare foot traffic data reveals that they have replaced their usual Chipotle visits with visits to other popular chains such as McDonald’s and Starbucks. They have also been slightly more likely than the average person to visit Whole Foods, whose offerings naturally overlap with Chipotle’s emphasis on integrity and healthfulness.
Looking to the Future
Two weeks from today, Chipotle will share its official Q1 earnings. We, alongside most analysts, anticipate the bitter pill the restaurant chain will have to swallow as they report on losses. But we also see a more unique, nuanced and slightly rosier picture, and urge Chipotle to continue building brand loyalty — one burrito at a time.
###
Foursquare’s Location Intelligence Paves the Path To Recovery
The data looks promising for recovery, but there will be trouble for Chipotle if they don’t lure back the formerly loyal visitors or nurture a new group of faithful fans.
Some ideas for how to do this effectively:
Analyze foot traffic patterns by using Place Insights powered by Foursquare.
Target advertising at the formerly frequent Chipotle visitors who have the power to bring back larger percentages of sales by using Pinpoint powered by Foursquare.
Measure all digital marketing initiatives so you can double down on the programs that are most effective at turning visitors into loyalists; you can do this through Attribution powered by Foursquare.
When you’re operating a brick-and-mortar business, location intelligence is critical. Chipotle’s burrito-based bottom line proves it.
Interested in any of the analysis or tools mentioned above? Do you need the power of location intelligence? Read more about our enterprise solutions or contact us.
###
Notes on Methodology | https://medium.com/foursquare-direct/foursquare-predicts-chipotle-s-q1-sales-down-nearly-30-foot-traffic-reveals-the-start-of-a-mixed-78515b2389af | ['Jeff Glueck'] | 2018-08-01 12:06:28.012000+00:00 | ['Technology', 'Foursquare', 'Finance', 'Data Science', 'Chipotle'] |
7 | US DOT Regulatory Moves Foreshadow Big Changes in 2020 | The Federal Motor Carrier Safety Administration (FMCSA) and the National Highway Traffic Safety Administration (NHTSA) posted important regulatory documents with comment deadlines this summer. Both documents are Advance Notices of Proposed Rulemaking (ANPRM). The ANPRMs not only signal the start of a more active regulatory phase for US DOT, they will now require US DOT to respond directly to comments and thus the ANPRMs will build a formal record that has the force of administrative law behind it.
The Basics
· The comment deadline for the NHTSA notice is July 29th.
· The comment deadline for the FMCSA notice was extended to August 26th.
The NHTSA notice starts the process of updating Federal Motor Vehicle Safety Standards for Level 4 and Level 5 AVs, in particular for unconventional designs (i.e. AVs without manual controls).
The FMCSA notice begins the process of updating its regulations for operation, testing and inspection. FMCSA will only apply new regulations to Level 4 and Level 5. As Levels 1–3 require the presence of a human operator, FMCSA states explicitly that it does not intend to change regulations for Level 1–3 CMVs.
· NHTSA indicates that changes to FMVSS Series 100 and 200 that accommodate AVs will be formalized. The process starts with the current ANPRM.
Why Is an Advance Notice Important?
An Advance Notice is the first concrete step in promulgating new regulations. Federal agencies must respond to germane comments on the record. Until now, US DOT has only published Requests for Comment and policy documents (i.e. Automated Vehicles 3.0), which is an information-gathering exercise. Important, to be sure, but RFCs do not have any legal force and federal agencies are not required to respond to comments. ANPRMs are followed by a formal Notice of Proposed Rulemaking (NPRM), which is the actual proposed regulation and will include on-the-record responses to comments submitted for the ANPRM.
The current ANPRMs are the most important regulatory documents released to date for the regulation of automated vehicles.
What Is and Is Not Covered?
The ANPRMs only cover Level 4 and Level 5 vehicles. No regulatory changes are proposed for Levels 1–3. FMCSA is broadly considering issues related to operation (including CDL endorsements and Hours of Service), testing and inspection. Roadside inspections and cybersecurity issues are part of the ANPRM.
The NHTSA ANPRM is the starting document for updating FMVSS Series 100 and 200. NHTSA wants to eliminate requirements that are not applicable to AVs. As importantly, NHTSA seeks assistance in developing new safety testing methods. Six different options are presented for comment: A) Normal ADS-DV Operation; B) TMPE (Test-Mode with Pre-programmed Execution); C) TMEC (Test-mode with External Control); D) Simulation; E) Technical Documentation; and F) Surrogate Vehicle with Human Controls.
Where Does US DOT Go from Here?
Both ANPRMs are the starting points for updating current regulations to accommodate the unique characteristics of AVs. Both FMCSA and NHTSA will need to review all comments and craft formal responses that are consistent with the regulations they intend to adopt. Neither agency faces a legal deadline to respond and release the follow-up NPRM; however, US DOT does want to get new rules in place to provide certainty for the industry. In the end, US DOT wants to promote and grow AV.
Critically, the NHTSA ANPRM indicates that a team at Virginia Tech is drafting new FMVSS technical language and such language for 30 FMVSSs is due by the end of the calendar year. Reading between the lines, it appears that these 30 FMVSSs will be part of a Notice of Proposed Rulemaking that NHTSA will want to release in early 2020. | https://medium.com/@KNaughton711/us-dot-regulatory-moves-foreshadow-big-changes-in-2020-91013ca9d771 | ['Keith Naughton'] | 2019-07-15 13:42:39.342000+00:00 | ['Administrative Law', 'Technology News', 'Self Driving Cars', 'Regulation', 'Transportation'] |
8 | An Illustrated Guide to Bi-Directional Attention Flow (BiDAF) | The year 2016 saw the publication of BiDAF by a team at the University of Washington. BiDAF handily beat the best Q&A models at that time and for several weeks topped the leaderboard of the Stanford Question Answering Dataset (SQuAD), arguably the most well-known Q&A dataset. Although BiDAF’s performance has since been surpassed, the model remains influential in the Q&A domain. The technical innovation of BiDAF inspired the subsequent development of competing models such as ELMo and BERT, by which BiDAF was eventually dethroned.
When I first read the original BiDAF paper, I was rather overwhelmed by how seemingly complex it was.
BiDAF exhibits a modular architecture — think of it as a composite structure made out of lego blocks with the blocks being “standard” NLP elements such as GloVe, CNN, LSTM and attention. The problem with understanding BiDAF is that there are just so many of these blocks to learn about and the ways they are combined can seem rather “hacky” at times. This complexity, coupled with the rather convoluted notations used in the original paper, serves as a barrier to understanding the model.
In this article series, I will deconstruct how BiDAF is assembled and describe each component of BiDAF in (hopefully) an easy-to-digest manner. Copious amount of pictures and diagrams will be provided to illustrate how these components fit together.
Here is the plan:
Part 1 (this article) will provide an overview of BiDAF.
Part 2 will talk about the embedding layers
Part 3 will talk about the attention layers
Part 4 will talk about the modeling and output layers. It will also include a recap of the whole BiDAF architecture presented in a very easy language. If you aren’t technically inclined, I recommend you to simply jump to part 4.
BiDAF vis-à-vis Other Q&A Models
Before delving deeper into BiDAF, let’s first position it within the broader landscape of Q&A models. There are several ways with which a Q&A model can be logically classified. Here are some of them:
Open-domain vs closed-domain. An open-domain model has access to a knowledge repository which it will tap when answering an incoming Query. The famous IBM Watson is one example. On the other hand, a closed-domain model doesn’t rely on pre-existing knowledge; rather, such a model requires a Context to answer a Query. A quick note on terminology here — a “Context” is an accompanying text that contains the information needed to answer the Query, while “Query” is just the formal technical word for question.
vs An open-domain model has access to a knowledge repository which it will tap on when answering an incoming Query. The famous IBM-Watson is one example. On the other hand, a closed-form model doesn’t rely on pre-existing knowledge; rather, such a model requires a Context to answer a Query. A quick note on terminology here — a “Context” is an accompanying text that contains the information needed to answer the Query, while “Query” is just the formal technical word for question. Abstractive vs extractive. An extractive model answers a Query by returning the substring of the Context that is most relevant to the Query. In other words, the answer returned by the model can always be found verbatim within the Context. An abstractive model, on the other hand, goes a step further: it paraphrases this substring to a more human-readable form before returning it as the answer to the Query.
vs An extractive model answers a Query by returning the substring of the Context that is most relevant to the Query. In other words, the answer returned by the model can always be found verbatim within the Context. An abstractive model, on the other hand, goes a step further: it paraphrases this substring to a more human-readable form before returning it as the answer to the Query. Ability to answer non-factoid queries. Factoid Queries are questions whose answers are short factual statements. Most Queries that begin with “who”, “where” and “when” are factoid because they expect concise facts as answers. Non-factoid Queries, simply put, are all questions that are not factoids. The non-factoid camp is very broad and includes questions that require logics and reasoning (e.g. most “why” and “how” questions) and those that involve mathematical calculations, ranking, sorting, etc.
So where does BiDAF fit in within these classification schemes? BiDAF is a closed-domain, extractive Q&A model that can only answer factoid questions. These characteristics imply that BiDAF requires a Context to answer a Query. The Answer that BiDAF returns is always a substring of the provided Context.
An example of Context, Query and Answer. Notice how the Answer can be found verbatim in the Context.
Another quick note: as you may have noticed, I have been capitalizing the words “Context”, “Query” and “Answer”. This is intentional. These terms have both technical and non-technical meaning and the capitalization is my way of indicating that I am using these words in their specialized technical capacities.
With this knowledge at hand, we’re now ready to explore how BiDAF is structured. Let’s dive in!
Overview of BiDAF Structure
BiDAF’s ability to pinpoint the location of the Answer within a Context stems from its layered design. Each of these layers can be thought of as a transformation engine that transforms the vector representation of words; each transformation is accompanied by the inclusion of additional information.
The BiDAF paper describes the model as having 6 layers, but I’d like to think of BiDAF as having 3 parts instead. These 3 parts along with their functions are briefly described below.
1. Embedding Layers
BiDAF has 3 embedding layers whose function is to change the representation of words in the Query and the Context from strings into vectors of numbers.
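To make “strings into vectors of numbers” concrete, here is a toy lookup in the spirit of a GloVe-style word embedding layer. The vocabulary and random vectors are placeholders; real BiDAF loads pretrained GloVe vectors and adds character-level and contextual embeddings on top (covered in Part 2):

```python
import numpy as np

# Toy vocabulary and a random stand-in for pretrained GloVe vectors (d = 4 here)
vocab = {"where": 0, "is": 1, "the": 2, "answer": 3}
glove = np.random.rand(len(vocab), 4)     # in practice: loaded from disk, not random

def embed(tokens):
    """Map a list of word strings to a (sequence_length, d) matrix of vectors."""
    return np.stack([glove[vocab[t]] for t in tokens])

query_matrix = embed(["where", "is", "the", "answer"])   # shape: (4, 4)
```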
2. Attention and Modeling Layers
These Query and Context representations then enter the attention and modeling layers. These layers use several matrix operations to fuse the information contained in the Query and the Context. The output of these steps is another representation of the Context that contains information from the Query. This output is referred to in the paper as the “Query-aware Context representation.”
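As a taste of what “fusing” means, here is a simplified numpy sketch of one half of that fusion: context-to-query attention, using the trainable similarity function described in the BiDAF paper. Query-to-context attention, the concatenation step, and the modeling layers are omitted (Part 3 walks through them in full):

```python
import numpy as np
from scipy.special import softmax

def context_to_query_attention(H, U, w):
    """H: (T, d) Context vectors; U: (J, d) Query vectors; w: (3d,) learned weights.
    Returns a (T, d) matrix: a relevance-weighted Query summary per Context word."""
    T, J = H.shape[0], U.shape[0]
    # Similarity S[t, j] = w · [h_t ; u_j ; h_t * u_j]
    S = np.array([[w @ np.concatenate([H[t], U[j], H[t] * U[j]])
                   for j in range(J)] for t in range(T)])
    A = softmax(S, axis=1)   # each Context word distributes attention over Query words
    return A @ U             # part of the "Query-aware" representation of the Context
```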
3. Output Layer
The Query-aware Context representation is then passed into the output layer, which will transform it to a bunch of probability values. These probability values will be used to determine where the Answer starts and ends.
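Here is a stripped-down sketch of that last step: turning per-position scores into probabilities and picking the most likely Answer span. It is illustrative only; BiDAF’s actual output layer derives these scores from the modeling layer’s output, as Part 4 explains:

```python
import numpy as np
from scipy.special import softmax

def best_span(start_scores, end_scores, max_len=15):
    """Pick the (start, end) pair that maximizes p_start * p_end, with start <= end."""
    p_start = softmax(start_scores)   # probability that the Answer starts at each word
    p_end = softmax(end_scores)       # probability that the Answer ends at each word
    best, span = -1.0, (0, 0)
    for s in range(len(p_start)):
        for e in range(s, min(s + max_len, len(p_end))):
            if p_start[s] * p_end[e] > best:
                best, span = p_start[s] * p_end[e], (s, e)
    return span   # word indices of the Answer substring within the Context
```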
A simplified diagram that depicts the BiDAF architecture is provided below: | https://towardsdatascience.com/the-definitive-guide-to-bi-directional-attention-flow-d0e96e9e666b | ['Meraldo Antonio'] | 2019-09-07 13:57:19.763000+00:00 | ['NLP', 'Artificial Intelligence', 'Machine Learning', 'Technology', 'Data Science'] |
9 | 3 retail branding tips to connect with your customers | Creating a connection between your brand and your customers is crucial for retail. So the big question is: How do you ensure that customers stay satisfied and return to purchase more?
As a retailer, you have to be open to change and adapt easily to stay relevant. Be close to your customers, listen to them and fulfil their needs.
But how do you achieve that? Let’s have a look at the 10 fastest-growing companies in The Netherlands in 2019. According to research done by the Erasmus Centre for Entrepreneurship, those companies are:
1. Picnic
2. Takeaway
3. Rituals
4. Action
5. Young Capital
6. Elastic
7. Coolblue
8. Calco
9. Adyen
10. BasicFit
What makes those companies so successful and why are they a great example of building a strong connection between a brand and its customers?
Customer Experience
They focus on customer experience. How do you ensure that customers have a positive experience during their visit to your shop, or during other interactions with your brand? A positive experience will lead to higher customer retention, and thereby more sales.
Action strongly focuses on surprising their customer. For example, when a customer goes to their store looking for bicycle lights, they often end up buying more than they had anticipated. Action surprises their customers with new products, things that can come in handy, for an attractive price. Action’s in-store experience is designed to increase impulse buying.
Another company that knows a thing or two about customer experience is Rituals. When a customer enters their shop, they are welcomed with a warm cup of tea and friendly shop assistants offering to demonstrate the newest scrubs, or creams. Rituals creates a unique customer experience which is consistent in every store. Rituals’ customers know what to expect. Rituals is attentive and ensures that everyone who walks through the door experiences a moment of relaxation. They are also known to regularly introduce new scrubs, creams and incenses, which ensures that their customers don’t get bored and are curious every time they walk past a Rituals store.
However, the real champion of creating a positive customer experience is Coolblue; ‘Everything for a smile’. Their service desk is open until midnight. Their customers have the opportunity to try the products at home and send them back, in case they are not 100% satisfied. If a customer is looking for comprehensive, personal advice, they are welcome to call the service desk. Alternatively, they could seek advice in one of the stores, where the employees are well-trained to assist customers with all sorts of questions. Customers can share their real Coolblue experience with others on the website of Coolblue which creates brand trust.
Love Brand
Have you ever heard of the term Love Brand? This term is used quite frequently these days. A Love Brand is a company that invests in its relationship with its customers. A Love Brand translates its unique values into the customer experience. It embraces its customers with love and enthusiasm and inspires them to share their love for the brand. A Love Brand also creates a connection between its customers.
Rituals is a real Love Brand. Rituals has such a strong brand identity, that one feels instantly connected with their brand vision. Their message empowers people to take a moment for themselves, because every little moment contributes to their happiness. When one enters a Rituals store, they are instantly embraced with a feeling of relaxation and calmness. Rituals tries to fully understand their customers’ needs and does a great job at seeing things from their customers’ point of view. They want their customers to take a step back from the daily hustle.
Another example of a strong Love Brand is Dille & Kamille. Dille & Kamille customers prefer shopping in the physical store instead of the website, because it provides a better brand experience. The smell, the surprising range of products, the new seasonal inspiration attracts them to the store. From the moment they walk into the store, they feel connected with the core values of the brand, a feeling of happiness and positivity.
Last, but not least, Marco Aarnink, with his company Print.com, knows how to give his customers the attention they deserve. His team takes the time to talk with their customers, understand them, and relate to them in order to establish a true relationship. Ordering a printed notebook has never been such a personalized experience. Print.com customers feel as a part of the Print.com community.
BE your customer
The fastest growing companies at the moment are Picnic and Takeaway.com. They both have something in common, namely that they do something that a customer would otherwise do themselves. They simplify their customers’ lives by providing a convenient service. Food on-demand, without the need to leave your house. This allows their customers to have more time for themselves.
In short, make sure that your customer experience is consistent throughout, regardless of whether your customers are seeing an advertisement, passing your store, or being greeted by an employee. Make sure that these experiences are all intertwined, as Coolblue does. This will strengthen your brand identity.
| https://medium.com/marketing-and-branding/3-retail-branding-tips-to-connect-with-your-customers-f686eecdf642 | ['Raul Tiru'] | 2020-12-21 12:21:39.825000+00:00 | ['Retail Marketing', 'Retail Technology', 'Branding', 'Retail', 'Retail Industry'] |
10 | Serious Sam’s Baffling Stadia Exclusivity | Serious Sam’s Baffling Stadia Exclusivity
Google keeps the iconic shooter franchise off consoles into 2021
Serious Sam Collection Stadia screenshot taken by the author.
We’re just a few weeks away from the release of Serious Sam 4, the latest installment in the long-running hardcore PC action shooter franchise. But if you’re a console player hoping to test the game out on your current machine or one of this fall’s new consoles…you’re out of luck until some unspecified date next year.
You see, Croteam, the makers of Serious Sam, signed a multi-game exclusivity deal with Google’s Stadia platform this past spring. Serious Sam 4 will not launch on any consoles this month, and will not come out in the lucrative launch window of this November’s new platforms, instead debuting on Steam and Google’s underused and oft-derided streaming platform.
This same deal also brought the Serious Sam Collection to Stadia, which is actually a finished version of Serious Sam Fusion, a beta experiment first launched on Steam that was abandoned over a year ago. Collection bundles all the content from the HD remakes of the first two Serious Sam games alongside Serious Sam 3 into one big awkward uber-game.
Serious Sam Collection Stadia screenshot taken by the author.
Once again, Collection won’t launch on other “consoles” until some unspecified date in the future. That’s a shame, because Stadia is a much worse way to play these games than the PC originals. Serious Sam Fusion included numerous graphics settings and experimental tweaks, all of which are disabled and hidden in the Stadia release.
Serious Sam is also a very fast and twitchy game, requiring precise timing and control for maximum play. It’s one of the most intense and hardcore action franchises ever made and it never slows down. Stadia’s latency performance has improved dramatically since launch, and it’s a testament to the state of the service that the Sam games function at all. But it never manages to feel quite as crisp and fun as playing on your own PC.
At least Stadia Pro subscribers were able to snap up the Collection as a freebie at its launch a few months ago, but otherwise Stadia users derive no obvious benefit from this deal. The Sam games, while always gorgeous, are so well optimized that they don’t necessarily need the beefy hardware back-end that Stadia provides. I have no doubt that the new game will run well on mid-range PCs and even the current base-level consoles whenever it finally launches on those machines.
Serious Sam Collection Stadia screenshot taken by the author.
Signing this deal means Croteam is missing out on a huge opportunity this fall. The launch lineups of the two upcoming consoles are sparse, and both Serious Sam 4 and Collection would have been perfect “second games” for people to play when the launch hype wears off and they’re antsy for new content. I think Collection would also be perfectly at home on Switch.
Instead of gaining great benefit from the marketing and hype surrounding these new machines, not to mention Nvidia’s upcoming 3000 series GPU launch, the Serious Sam franchise will instead spend the next several months under-performing on Stadia. Google hasn’t done enough to secure a large library for their fledgling platform, and their marketing budget for the service is essentially zero right now, outside of a small dedicated group on Reddit who will tell you it’s the best “console” they’ve ever used.
I do think Stadia has a lot of potential. I think that Serious Sam is an interesting demo of both the benefits and the pitfalls of the service. But this iconic twitch-based action franchise deserves to launch on platforms that can actually provide a fast, lag-free experience, and on platforms that will actually market their games. Hopefully, Serious Sam 4 will do well enough on Steam that Croteam can afford a big push when the game finally comes to consoles next year. | https://xander51.medium.com/serious-sams-baffling-stadia-exclusivity-adf70d56cb16 | ['Alex Rowe'] | 2020-11-30 19:23:51.532000+00:00 | ['Gaming', 'Technology', 'Tech', 'Business', 'Google'] |
11 | HNG Internship 5.0 So far | The internship officially started on 1st April 2019, and different tasks were listed for interns to get started.
Task 1
Sign up on GitHub. Create a personal repository. Commit a file. Pull the file. Change stuff. Commit again. Check that you have a profile image set here. If you do not have a blog, start one on Blogspot, WordPress, Medium, or anywhere else you want.
After all is complete, add this icon to your status here: :hatching_chick:
Task 2
Join the #design-stage1 channel. Use Figma to design yourself a home page. Your page must follow a particular format: your picture in the top left corner.
Links below your picture: About, Blog, Projects. Below that are three icons with links to your social media. On the right (separated by a line from the left section) will be some welcoming information about you. Everyone will need to do this. The mentors will give you feedback on your design; however, I will be the final judge. Until your design is accepted, you cannot start the internship. Pay attention to alignment, colors, and other important parts of design.
After completing Task 1 and Task 2 you will be promoted to stage one; the deadline is the 3rd of April.
Stage 1 to Stage 2 Task
Mark Essien announcements
Hello All, The Stage 1 to Stage 2 task will be handled by our chief mentor Seyi Onifade. He will drop the modalities of the task. Going to Stage 2 is *mandatory* for all, even if you are on Design or ML track.
Here are the requirements to enter stage 2:
a) Have a registered and accepted bug found and in the spreadsheet
b) Have a write up on your blog where you link to timbu.com
c) Enter your name on the hng.tech site.
Once all those three are complete, you are eligible for Stage 2. After today, I won’t accept any more designers or review any more designs. If you want to enter the design track, today is the deadline to get a :white_check_mark: on your design. If you have iterated multiple times over the last few days and are still getting :x:, please DM your design to @KingDavid and @Mfonobong. After giving feedback, I will have a committee meeting with them to see who we will still let in. But if you are rejected at committee, then please move on to the dev track.
This internship has been so challenging and amazing. Many thanks to Hotels.ng and Verifi for making this happen.
The mentors have been awesome; it is really not easy to coordinate 3000+ people. Kudos to you all.
Thanks | https://medium.com/@devmohy/hng-internship-5-0-so-far-1442e0f9d5ea | ['Mohammed Abiola'] | 2019-04-04 11:41:25.939000+00:00 | ['Figma', 'Hnginternship5', 'Github', 'Technology'] |
12 | Top Vue Packages for Adding Walkthroughs, Making HTTP Requests, and Validating Forms | Photo by Jack Young on Unsplash
Vue.js is an easy to use web app framework that we can use to develop interactive front end apps.
In this article, we’ll look at some of the best packages for adding walkthroughs, making HTTP requests, and validating forms.
vue-tour
We can use vue-tour to create walkthroughs for our app.
To use it, we write:
npm i vue-tour
Then we can register the components by writing:
import Vue from "vue";
import App from "./App.vue";
import VueTour from "vue-tour";

require("vue-tour/dist/vue-tour.css");

Vue.use(VueTour);
Vue.config.productionTip = false;

new Vue({
render: h => h(App)
}).$mount("#app");
We also imported the styles.
Then we can use it by writing:
<template>
<div>
<div id="v-step-0">step 0.</div>
<div class="v-step-1">step 1</div>
<div data-v-step="2">step 2</div>
<v-tour name="myTour" :steps="steps"></v-tour>
</div>
</template>
<script>
export default {
data() {
return {
steps: [
{
target: "#v-step-0",
header: {
title: "get started"
},
content: `hello <strong>world</strong>!`
},
{
target: ".v-step-1",
content: "step 1"
},
{
target: '[data-v-step="2"]',
content: "Step 2",
params: {
placement: "top"
}
}
]
};
},
mounted() {
this.$tours["myTour"].start();
}
};
</script>
We have the target elements and the v-tour component in the template.
Then, in the script, we have the steps array with the steps that target the elements on the page.
This way, the steps are displayed near the target element.
In each step, we can set the header, content, and placement of the steps.
Then, to start the tour, we call the start method in the mounted hook, as we did above.
vue-resource
vue-resource is an HTTP client that’s made as a Vue plugin.
To install it, we run:
npm i vue-resource
Then we can use it by writing:
import Vue from "vue";
import App from "./App.vue";
const VueResource = require("vue-resource");

Vue.use(VueResource);
Vue.config.productionTip = false;

new Vue({
render: h => h(App)
}).$mount("#app");
This registers the plugin.
Now we have access to the $http property in our component:
<template>
<div>{{data.name}}</div>
</template>
<script>
export default {
data() {
return {
data: {}
};
},
async mounted() {
const { body } = await this.$http.get("https://api.agify.io/?name=michael");
this.data = body;
}
};
</script>
It’s promise-based, so we can use async and await.
We can set common options that are used throughout the app by writing:
Vue.http.options.root = '/root';
Vue.http.headers.common['Authorization'] = 'Basic YXBpOnBhc3N3b3Jk';
We set the root URL for all requests and the Authorization header that’s used everywhere.
vue-form
vue-form is a form validation library for Vue apps.
We can use it by installing the package:
npm i vue-form
Then we can use it by writing:
main.js
import Vue from "vue";
import App from "./App.vue";
import VueForm from "vue-form";

Vue.use(VueForm);
Vue.config.productionTip = false;

new Vue({
render: h => h(App)
}).$mount("#app");
App.vue
<template>
  <div>
    <vue-form :state="formState" @submit.prevent="onSubmit">
      <validate tag="label">
        <span>Name *</span>
        <input v-model="model.name" required name="name">
        <field-messages name="name">
          <div>Success!</div>
          <div slot="required">name is required</div>
        </field-messages>
      </validate>
    </vue-form>
  </div>
</template>
<script>
export default {
data() {
return {
model: {
name: ""
},
formState: {}
};
},
methods: {
onSubmit() {
if (this.formState.$invalid) {
return;
}
}
}
};
</script>
We registered the plugin in main.js.
Then in the component, we used the vue-form component.
The state prop is set to the formState object, which will be updated when the form state changes.
@submit.prevent attaches the submit handler, with e.preventDefault called for us automatically.
v-model binds to the model.
The field-messages component displays the form validation messages.
required indicates that it’s required.
In the onSubmit method, we check formState.$invalid to see whether the form is invalid.
If it’s invalid, we don’t proceed with submission.
Other form state properties include $name with the name attribute value of the field.
$valid to see if the form or field is valid.
$focused to check that the form or field is focused.
$dirty to see if the form or field has been manipulated.
It also comes with other validators like email, URL, number, min and max length, and more.
Photo by Heather Lo on Unsplash
Conclusion
We can add a walkthrough tour to our Vue app with the vue-tour package.
vue-resource is a promise-based HTTP client that’s made for Vue apps.
vue-form is a useful form validation plugin for Vue apps.
| https://medium.com/javascript-in-plain-english/top-vue-packages-for-adding-walkthroughs-making-http-requests-and-validating-forms-8654072fa756 | ['John Au-Yeung'] | 2020-06-27 08:24:36.825000+00:00 | ['JavaScript', 'Programming', 'Web Development', 'Technology', 'Software Development'] |
13 | Are you using the “Scikit-learn wrapper” in your Keras Deep Learning model? | Are you using the “Scikit-learn wrapper” in your Keras Deep Learning model?
How to use the special wrapper classes from Keras for hyperparameter tuning?
Image created by the author with open-source templates
Introduction
Keras is one of the most popular go-to Python libraries/APIs for beginners and professionals in deep learning. Although it started as a stand-alone project by François Chollet, it has been integrated natively into TensorFlow starting in Version 2.0. Read more about it here.
As the official doc says, it is “an API designed for human beings, not machines” as it “follows best practices for reducing cognitive load”.
Image source: Pixabay
One of the situations where the cognitive load is sure to increase is hyperparameter tuning. Although there are so many supporting libraries and frameworks for handling it, for simple grid searches we can always rely on some built-in goodies in Keras.
In this article, we will quickly look at one such internal tool and examine what we can do with it for hyperparameter tuning and search.
Scikit-learn cross-validation and grid search
Almost every Python machine-learning practitioner is intimately familiar with the Scikit-learn library and its beautiful API, with simple methods like fit, get_params, and predict.
The library also offers extremely useful methods for cross-validation, model selection, pipelining, and grid search. If you look around, you will find plenty of examples of using these API methods for classical ML problems. But how do you use the same APIs for a deep learning problem?
When Keras enmeshes with Scikit-learn
Keras offers a couple of special wrapper classes, for both regression and classification problems, to utilize the full power of these APIs that are native to Scikit-learn.
In this article, let me show you an example of using simple k-fold cross-validation and exhaustive grid search with a Keras classifier model. It utilizes an implementation of the Scikit-learn classifier API for Keras.
The Jupyter notebook demo can be found here in my GitHub repo.
Start with a model generating function
For this to work properly, we should create a simple function to synthesize and compile a Keras model with some tunable arguments built-in. Here is an example,
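The original gist is not embedded in this page, so here is a minimal sketch of such a function, assuming a small fully-connected network; the layer size and the single tunable neurons argument are illustrative, not necessarily the author’s exact choices:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

def create_model(neurons=12):
    # Small fully-connected network for the 8-feature Pima dataset
    model = Sequential()
    model.add(Dense(neurons, input_dim=8, activation='relu'))
    model.add(Dense(1, activation='sigmoid'))
    # Binary classification: binary cross-entropy loss, accuracy as the metric
    model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
    return model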
Data
For this demo, we are using the popular Pima Indians Diabetes dataset. This dataset is originally from the National Institute of Diabetes and Digestive and Kidney Diseases. The objective is to diagnostically predict whether or not a patient has diabetes, based on certain diagnostic measurements included in the dataset. So, it is a binary classification task.
We create the feature and target vectors, X and Y, and scale the feature vector using a scaling API from Scikit-learn like MinMaxScaler. We call the scaled result X_scaled.
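In code, the preprocessing amounts to something like the following sketch; the CSV file name and the assumption that the outcome label sits in the ninth column follow the standard Pima layout:

import pandas as pd
from sklearn.preprocessing import MinMaxScaler

data = pd.read_csv('pima-indians-diabetes.csv')
X = data.iloc[:, 0:8].values  # the eight diagnostic measurements
Y = data.iloc[:, 8].values    # the binary diabetes outcome

scaler = MinMaxScaler()
X_scaled = scaler.fit_transform(X)  # every feature squashed into [0, 1]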
That’s it for data preprocessing. We can pass this X_scaled and Y directly to the special classes we will build next.
The KerasClassifier class
This is the special wrapper class from Keras that enmeshes the Scikit-learn classifier API with Keras parametric models. We can pass on various model parameters corresponding to the create_model function, and other hyperparameters like epochs and batch size, to this class.
Here is how we create it,
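A sketch of the call is below; the epoch count and batch size are placeholder values for the fixed setting described here, and depending on your install the wrapper may live under keras.wrappers.scikit_learn instead:

from tensorflow.keras.wrappers.scikit_learn import KerasClassifier

model = KerasClassifier(build_fn=create_model, epochs=100, batch_size=10, verbose=0)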
Note how we pass our model creation function as the build_fn argument. This is an example of using a function as a first-class object in Python, where you can pass functions as regular parameters to other classes or functions.
For now, we have fixed the batch size and the number of epochs we want to run our model for, because we just want to run cross-validation on this model. Later, we will turn these into hyperparameters and do a grid search to find the best combination.
10-fold cross-validation
Building a 10-fold cross-validation estimator is easy with the Scikit-learn API. Here is the code. Note how we import the estimators from the model_selection module of Scikit-learn.
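A sketch, with the shuffle flag and seed as assumptions:

from sklearn.model_selection import StratifiedKFold

kfold = StratifiedKFold(n_splits=10, shuffle=True, random_state=42)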
Then, we can simply run the model with this code, where we pass on the KerasClassifier object we built earlier along with the feature and target vectors. The important parameter here is the cv where we pass the kfold object we built above. This tells the cross_val_score estimator to run the Keras model with the data provided, in a 10-fold Stratified cross-validation setting.
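That call looks something like this:

from sklearn.model_selection import cross_val_score

cv_results = cross_val_score(model, X_scaled, Y, cv=kfold)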
The output cv_results is a simple NumPy array of all the accuracy scores. Why accuracy? Because that’s what we chose as the metric in our model compiling process. We could have chosen any other classification metric like precision, recall, etc., and, in that case, that metric would have been calculated and stored in the cv_results array.
model.compile(loss='binary_crossentropy',
              optimizer='adam',
              metrics=['accuracy'])
We can easily calculate the average and standard deviation of the 10-fold CV run to estimate the stability of the model predictions. This is one of the primary utilities of a cross-validation run.
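For example:

print("Mean accuracy: %.3f, std: %.3f" % (cv_results.mean(), cv_results.std()))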
Beefing up the model creation function for grid search
Exhaustive (or randomized) grid search is often a common practice for hyperparameter tuning or to gain insights into the working of a machine learning model. Deep learning models, being endowed with a lot of hyperparameters, are prime candidates for such a systematic search.
In this example, we will search over the following hyperparameters,
activation function
optimizer type
initialization method
batch size
number of epochs
Needless to say, we have to add the first three of these parameters to our model definition.
Then, we create the same KerasClassifier object as before,
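Putting the two together, here is a hedged sketch with illustrative defaults; note that the batch size and epoch count are no longer fixed on the classifier, since those will now come from the grid:

def create_model(activation='relu', optimizer='adam', init='glorot_uniform'):
    model = Sequential()
    model.add(Dense(12, input_dim=8, kernel_initializer=init, activation=activation))
    model.add(Dense(1, kernel_initializer=init, activation='sigmoid'))
    model.compile(loss='binary_crossentropy', optimizer=optimizer, metrics=['accuracy'])
    return model

model = KerasClassifier(build_fn=create_model, verbose=0)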
The search space
We decide to make the exhaustive hyperparameter search space 3×3×3×3×3 = 243 combinations in size.
Note that the actual number of Keras runs will also depend on the number of cross-validation folds we choose, as cross-validation will be used for each of these combinations.
Here are the choices,
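The exact values aren’t reproduced on this page, so treat the lists below as a representative assumption, three options per dimension, which is what yields the 243 combinations:

activations = ['relu', 'tanh', 'sigmoid']
optimizers = ['adam', 'rmsprop', 'sgd']
inits = ['glorot_uniform', 'uniform', 'normal']
batch_sizes = [10, 20, 40]
epochs = [10, 50, 100]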
That’s a lot of dimensions to search over!
Image source: Pixabay
Enmeshing Scikit-learn GridSearchCV with Keras
We have to create a dictionary of search parameters and pass it on to the Scikit-learn GridSearchCV estimator. Here is the code,
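A sketch of the setup, reusing the lists above; activation, optimizer, and init are routed through to create_model, while batch_size and epochs are consumed by the wrapper itself:

from sklearn.model_selection import GridSearchCV

param_grid = dict(activation=activations, optimizer=optimizers, init=inits,
                  batch_size=batch_sizes, epochs=epochs)
grid = GridSearchCV(estimator=model, param_grid=param_grid, cv=3, verbose=2)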
By default, GridSearchCV runs a 5-fold cross-validation if the cv parameter is not specified explicitly (from Scikit-learn v0.22 onwards). Here, we keep it at 3 for reducing the total number of runs.
It is advisable to set the verbosity of GridSearchCV to 2 to keep a visual track of what’s going on. Remember to keep the verbose=0 for the main KerasClassifier class though, as you probably don't want to display all the gory details of training individual epochs.
Then, just fit!
As we all have come to appreciate the beautifully uniform API of Scikit-learn, it is the time to call upon that power and just say fit to search through the whole space!
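In code, that is a single line:

grid_result = grid.fit(X_scaled, Y)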
Image source: Pixabay
Grab a cup of coffee because this may take a while depending on the deep learning model architecture, dataset size, search space complexity, and your hardware configuration.
In total, there will be 729 fittings of the model, 3 cross-validation runs for each of the 243 parametric combinations.
If you don’t like a full grid search, you can always try RandomizedSearchCV, the randomized grid search from the Scikit-learn stable!
What does the result look like? Just like what you expect from a Scikit-learn estimator, with all the goodies stored for your exploration.
What can you do with the result?
You can explore and analyze the results in a number of ways based on your research interest or business goal.
What’s the combination with the best accuracy?
This is probably at the top of your mind. Just print it using the best_score_ and best_params_ attributes from the GridSearchCV estimator.
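For example:

print("Best: %.3f using %s" % (grid_result.best_score_, grid_result.best_params_))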
We did the initial 10-fold cross-validation using ReLU activation and Adam optimizer and got an average accuracy of 0.691. After doing an exhaustive grid search, we discover that tanh activation and rmsprop optimizer could have been better choices for this problem. We got better accuracy!
Extract all the results in a DataFrame
Many a time, we may want to analyze the statistical nature of the performance of a deep learning model under a wide range of hyperparameters. To that end, it is extremely easy to create a Pandas DataFrame from the grid search results and analyze them further.
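The cv_results_ attribute is a plain dictionary, so one line is enough; the sort is just a convenience:

import pandas as pd

cv_results_df = pd.DataFrame(grid_result.cv_results_)
cv_results_df = cv_results_df.sort_values('mean_test_score', ascending=False)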
Here is the result,
Analyze visually
We can create beautiful visualizations from this dataset to examine and analyze what choice of hyperparameters improves the performance and reduces the variation.
Here is a set of violin plots of the mean accuracy created with Seaborn from the grid search dataset.
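As a sketch, such a plot can be drawn straight from the DataFrame built above, since GridSearchCV stores each searched hyperparameter in a param_<name> column:

import seaborn as sns
import matplotlib.pyplot as plt

sns.violinplot(x='param_activation', y='mean_test_score', data=cv_results_df)
plt.show()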
Here is another plot,
Summary and further thoughts
In this article, we went over how to use the powerful Scikit-learn wrapper API, provided by the Keras library, to do 10-fold cross-validation and a hyperparameter grid search for achieving the best accuracy for a binary classification problem.
Using this API, it is possible to enmesh the best tools and techniques of the Scikit-learn-based general-purpose ML pipeline with Keras models. This approach definitely has huge potential to save a practitioner a lot of the time and effort of writing custom code for cross-validation, grid search, and pipelining with Keras models.
Again, the demo code for this example can be found here. Other related deep learning tutorials can be found in the same repository. Please feel free to star and fork the repository if you like. | https://towardsdatascience.com/are-you-using-the-scikit-learn-wrapper-in-your-keras-deep-learning-model-a3005696ff38 | ['Tirthajyoti Sarkar'] | 2020-09-23 00:53:29.541000+00:00 | ['Machine Learning', 'Artificial Intelligence', 'Deep Learning', 'Technology', 'Data Science'] |
14 | Global IT Asset Management (ITAM) Software Market 2020 Business Strategies — BMC, Microsoft, Symantec, IBM Software, JustSAMIt | IT Asset Management (ITAM) Software Market
The market report titled “IT Asset Management (ITAM) Software Market: Global Industry Analysis, Size, Share, Growth, Trends, and Forecasts 2016–2024”, published by Zion Market Research, will put forth a systematized evaluation of the vital facets of the global IT Asset Management (ITAM) Software market. The report will function as a medium for a better assessment of the existing and future situations of the global market. It will offer a 360-degree framework of the competitive landscape and dynamics of the market and related industries. Further, it covers the major competitors within the market as well as budding companies, along with comprehensive details such as market share on the basis of revenue, demand, high-quality product manufacturers, sales, and service providers. The report will also shed light on the numerous growth prospects dedicated to diverse industries, organizations, suppliers, and associations providing several services and products. The report will offer buyers detailed direction on growth in the market, which would further provide them a competitive edge during the forecast period.
The IT Asset Management (ITAM) Software Market research report provides an in-depth examination of the market scenario regarding market size, share, demand, growth, trends, and forecast for 2020–2026. The report covers the impact analysis of the COVID-19 pandemic. The COVID-19 pandemic has affected exports, imports, demand, and industry trends, and is expected to have an economic impact on the market. The report provides a comprehensive analysis of the impact of the pandemic on the entire industry and provides an overview of a post-COVID-19 market scenario.
The global IT Asset Management (ITAM) Software Market report offers a complete overview of the IT Asset Management (ITAM) Software Market globally. It presents real data and statistics on the inclinations and improvements in global IT Asset Management (ITAM) Software Markets. It also highlights manufacturing, abilities & technologies, and unstable structure of the market. The global IT Asset Management (ITAM) Software Market report elaborates the crucial data along with all important insights related to the current market status.
Request Free Sample Report of IT Asset Management (ITAM) Software Market Report @ https://www.zionmarketresearch.com/sample/it-asset-management-itam-software-market
Our Free Complimentary Sample Report Accommodates a Brief Introduction of the research report, TOC, List of Tables and Figures, Competitive Landscape and Geographic Segmentation, Innovation and Future Developments Based on Research Methodology
Global IT Asset Management (ITAM) Software Market Report covers major market characteristics, size and growth, key segments, regional breakdowns, competitive landscape, market shares, trends and strategies for this market.
Major Market Players Included in this Report is:
BMC, Microsoft, Symantec, IBM Software, JustSAMIt, Attachmate, Samanage, Scalable Software, Freshservice, and Hewlett Packard. Other players dominating the global market include Deloitte, Spiceworks, Lansweeper, Real Asset Management, InvGate, LabTech, StacksWare, Auvik, eAbax, INSPUR, ManageEngine, Chevin FleetWave, and Atlassian.
The global IT Asset Management (ITAM) Software Market report offers a knowledge-based summary of the global IT Asset Management (ITAM) Software Market. It identifies the new players entering the market and emphasizes its basic fundamentals. The clear demonstration of the most recent improvements and new industrial explanations offers our customers a free hand to build avant-garde products and advanced techniques that will contribute to offering more efficient services.
The report analyzes the key elements such as demand, growth rate, cost, capacity utilization, import, margin, and production of the global market players. A number of the factors are considered to analyze the global IT Asset Management (ITAM) Software Market. The global IT Asset Management (ITAM) Software Market report demonstrates details of different sections and sub-sections of the global IT Asset Management (ITAM) Software Market on the basis of topographical regions. The report provides a detailed analysis of the key elements such as developments, trends, projections, drivers, and market growth of the global IT Asset Management (ITAM) Software Market. It also offers details of the factors directly impacting on the growth of the global IT Asset Management (ITAM) Software Market. It covers the fundamental ideas related to the growth and the management of the global IT Asset Management (ITAM) Software Market.
Download Free PDF Report Brochure @ https://www.zionmarketresearch.com/requestbrochure/it-asset-management-itam-software-market
Note — In order to provide a more accurate market forecast, all our reports will be updated before delivery by considering the impact of COVID-19.
(*If you have any special requirements, please let us know and we will offer you the report as you want.)
The global IT Asset Management (ITAM) Software Market research report highlights most of the data gathered in the form of tables, pictures, and graphs. This presentation helps the user to understand the details of the global IT Asset Management (ITAM) Software Market in an easy way. The global IT Asset Management (ITAM) Software Market report research study emphasizes the top contributors to the global IT Asset Management (ITAM) Software Market. It also offers ideas to the market players assisting them to make strategic moves and develop and expand their businesses successfully.
Promising Regions & Countries Mentioned In The IT Asset Management (ITAM) Software Market Report:
North America (United States)
Europe (Germany, France, UK)
Asia-Pacific (China, Japan, India)
Latin America (Brazil)
The Middle East & Africa
Inquire more about this report @ https://www.zionmarketresearch.com/inquiry/it-asset-management-itam-software-market
Highlights of Global Market Research Report:
Show the market by type and application, with sales market share and growth rate by type and application
IT Asset Management (ITAM) Software Market forecast, by regions, type and application, with sales and revenue, from 2019 to 2026
Define industry introduction, IT Asset Management (ITAM) Software Market overview, market opportunities, product scope, market risk, market driving force;
Analyse the top manufacturers of IT Asset Management (ITAM) Software Market Industry, with sales, revenue, and price
Display the competitive situation among the top manufacturers, with sales, revenue and market share
Request the coronavirus impact analysis across industries and market
Request impact analysis on this market @ https://www.zionmarketresearch.com/toc/it-asset-management-itam-software-market
The report’s major objectives include:
To establish a comprehensive, factual, annually-updated and cost-effective information based on performance, capabilities, goals and strategies of the world’s leading companies.
To help current suppliers realistically assess their financial, marketing and technological capabilities vis-a-vis leading competitors.
To assist potential market entrants in evaluating prospective acquisitions and joint venture candidates.
To complement organizations’ internal competitor information gathering efforts by providing strategic analysis, data interpretation and insight.
To identify the least competitive market niches with significant growth potential.
Global IT Asset Management (ITAM) Software Market Report Provides Comprehensive Analysis of:
IT Asset Management (ITAM) Software Market industry diagram
Up and Downstream industry investigation
Economy effect features diagnosis
Channels and speculation plausibility
Market contest by Players
Improvement recommendations examination
Also, Research Report Examines:
Competitive companies and manufacturers in global market
By Product Type, Applications & Growth Factors
Industry Status and Outlook for Major Applications / End Users / Usage Area
Thanks for reading this article; you can also get individual chapter wise section or region wise report version like North America, Europe or Asia. | https://medium.com/@torasep/global-it-asset-management-itam-software-market-2020-business-strategies-bmc-microsoft-9ba6799be9f7 | ['Prakash Torase'] | 2020-12-03 09:11:00.308000+00:00 | ['Market Research Reports', 'Technology', 'Software', 'Development', 'Management And Leadership'] |
15 | Great article, totally agree with that! Thanks for sharing! | Great article, totally agree with that! Thanks for sharing! | https://medium.com/@bryan-dijkhuizen/great-article-totally-agree-with-that-thanks-for-sharing-c53df300462d | ['Bryan Dijkhuizen'] | 2020-12-22 22:47:08.500000+00:00 | ['Opinion', 'Technology', 'Software Development', 'Programming', 'Data Science'] |
16 | Creating real opportunities out of technology | I never had technology as the main focus of my work. Even though I have been working with it for almost a decade, I was never really aware of the great transformation it implies for society. Only in the last couple of years have I been able to see the bigger picture, by putting all my energy into being an entrepreneur.
I think the exponential growth that technology has had in the last decade caught us all by surprise, and it has never been easy to adopt these changes, since they imply restructuring the way you think; however, people who want to change the world for good are embracing it and moving forward to a future where technology helps humankind live better. Furthermore, this crazy idea that I started with a group of friends, called Trascender Global, is also surfing that wave.
Today, more than a year into a pandemic and big changes in our lives, I want to talk about how a small idea can seize technological opportunities to bring innovation on three big fronts: the redefinition of the concept of work, the creation of opportunities in a region that desperately needs them, and the empowerment of people to build ventures against all uncertainty.
Redefining work and teleworking
Teleworking has been on everyone’s lips during this pandemic, and it has been one of the biggest challenges companies have had to endure. However, technology has once again come to the rescue: videoconferencing platforms, remote learning, online schedulers, task managers, automated to-do lists, and all of the tools we take for granted now have allowed us not just to survive, but to maintain steady growth.
However, for a freelancer, this is nothing new. On the contrary, this has been their day-to-day lifestyle for years: a freelancer is used to working from home, balancing a productive work schedule with a healthy personal life, accommodating tight deadlines and sudden work changes, constantly “reinventing” themselves with new courses that complement their skills, and building a multidisciplinary contact network.
This type of philosophy and lifestyle is what Trascender is based on: even if we’re not physically close, or have never met beyond a screen, we can still communicate effectively, reach our targets, deliver projects, and build fruitful relationships with both our customers and our crew of Trascendentales.
Bringing innovative opportunities
Colombia and Latin America are full of splendidly talented people who, due to economic and social difficulties, never get to work on what they envision or dream of; also, many tech roles are not equally paid when compared to earnings in developed countries. Our situation is complex, and rather than explaining or trying to justify it, we focus on how we can improve it and provide better opportunities to people in our region.
Trascender Global focuses on working with clients in a B2B model, creating solutions in the Data Science industry, which go from analytics and descriptive analysis in Business Intelligence to Machine Learning and Artificial Intelligence innovation in different industries and economic sectors. We do that by transferring and applying state-of-the-art knowledge on data in need-tailored solutions for businesses in foreign countries. The beautiful thing is that those solutions are being implemented by people all around our country. We’re expanding now into different Latin American countries (with baby steps, though), allowing those people to feel that they’re relevant by knowing that even if they’re in a small town, they can do artificial intelligence for a big multinational corporation.
We urge our people, our crew, which we call Trascendentales, to strive, excel, and realize that we can solve unsolved issues with our talent and capacity. And in the process, they learn not only about Data Science, Artificial Intelligence, and other areas, but also about leadership, self-esteem, and even other languages and cultures.
Encouraging the new era of entrepreneurs
Many believe that entrepreneurship is some kind of rocket science: “It’s too hard”, “it’s too risky”, “why don’t you get a real job?” are some of the comments that discourage many people from pursuing this way of life, making them feel that they’re taking the wrong path.
In reality, to become an entrepreneur, only one thing is required: to have an idea. When the idea is nurtured, it becomes a vision, then an objective, a purpose, a hypothesis, and sometimes, a business model. All great companies started like that, with the seed of an idea, nurtured by passion and true self-belief.
However, every entrepreneur must also bear the humbling feeling of knowing that you can fail, that your idea might not be “a million-dollar idea”, but you must be willing to let experience and experimentation guide you, to fall and rise, to adapt and overcome, over and over, until you reach that turning point where everything falls into place.
In Trascender Global, we’re passionate about that process: the journey of self-discovery and evolution that every entrepreneur must go through is one of the most valuable growing experiences ever. We look forward to supporting all the ideas that come our way, guiding them on this path, putting our experience on the table, and serving as a valuable ally, so we can grow together and make our crew’s ideas transcend. It is a big bet in which we create new staff groups within our crew to develop products or services, and soon you will be hearing from some of them.
We like to call this methodology “Monkeys” because, just like an idea that moves around in your head, a monkey hangs in a tree and moves swiftly from branch to branch. Still, it has the capacity to evolve, use tools and language, and follow its own learning process, continually improving, until it can walk on its own. Oh, and also, because of Night Monkey, from Spiderman. Spiderman is cool 🤣.
What’s ahead for us?
Trascender Global dreams of following in the footsteps of big tech companies and organizations such as Globant, PSL, Google, IBM, Accenture, and others. We want to lead the technological path in Colombia and Latin America, continuing to develop tailored solutions and new products in the huge world of data and technology, harnessing the power of Artificial Intelligence, Business Intelligence, and Software Engineering.
Achieving this will allow us to continue innovating on those three fronts, bringing more and better opportunities for the people and the economy in our region.
Despite being a small company, we now see ourselves as a big and remarkable organization in the not-so-distant future. Trascender Global is not just me, a single person, but an idea made of all the dreams of the people who somehow believe in us, and that force will continue to develop even when we’re gone. Precisely, that’s what Trascender means: to transcend. Feel free to share your thoughts with us, to share these words, or even to join us in this pursuit.
Bidding farewell
Andrés Martiliano, CEO of Trascender Global
Thanks for coming this far. You can find me, Andrés Martiliano-Martínez, on LinkedIn if you want to continue this conversation. I’m a mix between extrovert and introvert, driven by a strong passion for seeing dreams come true, and I hope you take something valuable from our short yet beautiful story. See you soon! | https://medium.com/@trascenderglobal/creating-real-opportunities-out-of-technology-32ad86002449 | ['Trascender Global'] | 2021-06-08 23:49:58.450000+00:00 | ['Entrepreneurship', 'Technology', 'Opportunity', 'Backstory'] |
17 | Top Content Service Platform | Top Content Service Platform
Today, the growth of content services is driving the evolution of enterprise content management (ECM). Consequently, leading organizations are deploying content services platforms (CSPs) to create a unified information management environment that can connect, share, and govern content across the entire enterprise, and build on the strengths of their ECM deployments. The development of CSPs has been focused on delivering content in a way that meets the needs of both sides — the digital workspace and digital business — of digital transformation. Within the digital workplace, CSPs are based around contextual content delivery, sharing, and collaboration to enable users in accessing relevant content from the application or device that best suits their needs. Besides, with the cloud-based nature of CSPs, businesses can overcome barriers in content storage and distribution of critical information.
With a comprehensive understanding of this new development, CIOReview has compiled a list of the 20 most promising content services platform providers to guide organizations in harnessing the power of new-age content management technologies and ensure the effective automation of content-related processes. With several innovative technological capabilities and success stories up its sleeves, these firms are constantly proving their mettle in the field of content management. We hope this edition of CIOReview helps you build the partnership that you and your firm need to foster a unified and efficient information management environment.
We present to you CIOReview’s “20 Most Promising Content Services Platform Solution Providers”
Top Content Service Platform
“Computhink’s Enterprise Content Management solution Contentverse is an exception. Computhink provides highly secure implementation options, with multiple choices of secure client access, mission-critical application integration, and secure access for a wide range of business environments. No organization in need of a versatile, easy-to-use ECM has to settle for less.”
Founded in 1994, ImageSource is a privately held corporation headquartered in Olympia, WA. Through a number of strategic acquisitions and mergers, ImageSource has positioned itself as a leader among Enterprise Content Management integrators. ImageSource is the manufacturer of ILINX software and a recognized leader in consulting, strategic analysis and rapid application deployment of Enterprise Content Management (ECM) solutions. Utilizing expert teams and a broad portfolio of world leading ECM technology, the company improves critical business processes that leverage the management of data, documents and integration with legacy software systems
KnowledgeLake provides its customers with a PaaS cloud platform as a part of their managed services. Its fully managed services help the client migrate data to the cloud, perform necessary software configurations, define jobs that need automation, capture and store documents, and various other things. The company does all the mentioned jobs without requiring any IT administration. Services offered by KnowledgeLake are entirely cloud-based. Its clients can involve themselves in the migration and any content service operation at any point of time they wish
The company provides corporations across the world with powerful digital content publishing platforms, enabling them to create and distribute rich, interactive digital content on any device. The tools are aimed at various businesses, from media companies and corporations to book publishers. The company’s clients include world-leading business giants. The offices are located in Paris and Montpellier, but the company’s partners and collaborators are from all over the world. Being part of the global internet services company Rakuten enables it to innovate and provide clients with genuinely cutting-edge and localized services
Appatura
An industry leader since 1993, Appatura innovates with its data-driven experts in Enterprise Content Management (ECM). The company empowers its clients to leverage and repurpose content across the enterprise via products like DocuBuilder. Headquartered in Ridgefield Park, New Jersey, Appatura evolves consistently by updating its technology to meet and exceed industry requirements. They stay ahead of the curve and always bring new expertise to the table by establishing and maintaining good partnerships with their clients. Appatura’s host of solutions develops an integrated hub that enables organizations to break content down, like LEGO blocks, to an atomic level and then transform it into creative assets, improving productivity, ensuring compliance, and creating workplace efficiencies
Aptara
Headquartered just outside of Washington, D.C., and serving the world’s 10 largest publishers and Fortune 100 companies, Aptara transforms content for engaging and monetizing new Digital and Corporate Learning audiences. The company’s full-service content production accelerates information providers’ transition from print to digital. From creation and design to new media enhancements and output for all mobile devices and platforms, Aptara produces innovative digital products that deliver content how, when, and where readers want it, while giving content providers renewed agility and revenue opportunities. Aptara’s host of solutions includes Digital Content Development, Corporate Learning and Performance, Custom Content Services, Customer Lifestyle Management, Healthcare RCM, IT Services, and Learning Administration Services
Box
Founded in 2005, Box is a Cloud Content Management company that enables enterprises to revolutionize how they work by safely connecting their people, information, and applications. Headquartered in Redwood City, CA, the company offers various products such as cloud content management, security and compliance, collaboration, workflow, integrations, developer tools and APIs, and IT administration and controls. Box Consulting helps businesses transform their organizations by focusing on accelerating time to market and helping companies unlock their full potential, from successful implementations to more sophisticated technical integrations and adoption programs. Box crafts a customized plan that provides a team, the necessary tools, and the experience to bring an organization into the digital age quickly
Everteam
Having 25 years of experience in the field of Information Governance and Enterprise Content Management, Everteam offers a complete content services platform for Information Governance. Everteam solutions enable enterprises to build and manage content-driven processes that support a range of business opportunities. The company’s content management solutions include a web app studio, document management, and digital asset management. The web app studio is a rapid application development platform that empowers the client to develop automated processes with minimal dependency on the IT department. Document management efficiently builds, manages, and structures corporate repositories, helping customers reduce the amount of time and effort spent on managing corporate documents. Data storage helps the business improve, become agile, and drive online customer engagement
Interfy
Headquartered in Orlando, Florida, Interfy believes in the power of digital transformation. To help companies in various industries reach this level, it works on the organization and digitization of content and processes in the cloud. Today, it is the only one in the industry to provide 100 percent integrated ECM, BPM, BI, PMS, CRM, and ERP tools, built into a single platform and accessible with a single login. The company develops software platforms to accelerate digital business transformation, enabling better information management in an affordable and customizable way. With their team of professionals who are passionate about creating innovative technologies, they have over two decades of experience developing software for corporate management, processes, and content
Laserfiche
Headquartered in Long Beach, California, Laserfiche Enterprise Content Management transforms the way organizations manage content, automate document-driven business processes, and make timely, informed decisions. Leveraging Laserfiche, organizations can innovate in how unstructured information and documents are processed and analyzed to extract business results. Laserfiche offers intuitive solutions for capture, workflow, case management, electronic forms, cloud, government-certified records management, and mobile. The company, with its team of experts, provides Enterprise Content Management (ECM) software, Document Management (DM), Records Management (RM), Business Process Management (BPM), Electronic Forms, Process Automation, Workflow, and Case Management
MODX
Headquartered in Dallas, MODX is the company that backs the open-source Content Management System and Web Application Framework of the same name. MODX Revolution is the world’s fastest, most secure, flexible, and scalable open-source CMS. Their cloud platform, MODX Cloud, is the ultimate hosting for modern PHP applications, especially MODX. The company provides the fastest, most flexible, scalable, and secure open-source CMS. Unlike other CMS platforms’ plugins, MODX Extras are designed to be easily tailored to your specific design and requirements, and they connect to the tools the client already uses and easily integrate with anything that has an API
Newgen Software Technologies
A provider of Business Process Management (BPM), Customer Communication Management (CCM), Enterprise Content Management (ECM), Document Management System (DMS), Workflow and Process Automation software Newgen Software was founded in 1992. The company has extensive, mission-critical solutions that have been deployed in Banks, Insurance firms, BPO’s, Healthcare Organizations, Government, and Telecom Companies. Newgen, through its proven Business Process Management, Enterprise Content Management, Customer Communications Management, and Case Management platforms, brings about the perfect amalgamation of information/content, technology, and processes, the building blocks of Digital Transformation. Newgen’s commitment is towards enabling clients to achieve greater agility when it comes to transforming processes, managing information, enhancing overall customer satisfaction, and driving enterprise profitability
Nuxeo
Nuxeo, the maker of the leading, cloud-native content services platform, is reinventing enterprise content and digital asset management. Nuxeo is fundamentally changing how people work with both data and content to realize new value from digital information. Its cloud-native, hyper-scalable content services platform has been deployed by large enterprises, mid-sized businesses, and government agencies worldwide. Founded in 2008, the company is based in New York with offices across the United States and Europe. The company with a team of experts specializes in content management software, enterprise content management, document management, application platform, content management platform, digital asset management, case management, and content repository
OpenText
OpenText, The Information Company™, a market leader in Enterprise Information Management software and solutions, enables intelligent and connected enterprises by managing, leveraging, securing and gaining insight into enterprise information, on-premises or in the cloud. The company was founded in the year 1991 and is based in Waterloo, ON. From its inception, the company with a team of experts specializes in providing Enterprise Information Management, Business Process Management, Enterprise Content Management, Information Exchange, Customer Experience Management, and Information Discovery. The company’s EIM products enable businesses to grow faster, lower operational costs, and reduce information governance and security risks by improving business insight, impact, and process speed
Primero Systems
Primero Systems provides software solutions to enterprises. More specifically, they understand enterprise software and the role it plays in helping to maintain a competitive edge and grow a business. The company is a trusted software development firm that has been delivering innovative solutions, ranging from enterprise-grade custom software to web content management systems, for more than 25 years. The company wants to help businesses achieve their goals and maximize their potential through software. Integrity, Competency, and Quality are the words that shape the company, and it develops and supports its products and services with them in mind
Simflofy
Simflofy’s content integration platform is used for creating federated content networks and migrating content from legacy content sources. The platform has connectors to over 20 of the most common content sources. They have developed solutions on the platform for federated search, in-place records management, and email archiving. Solutions based on the content integration platform have been used by millions of users and work with billions of documents. The solutions have been used in various industries, including healthcare, insurance, consulting, and state and federal government
Systemware
Founded in 1981, Systemware quickly established itself as a developer of products that effectively managed enterprise content. The solutions can help with report output management, correspondence, and customer communications. With more than 35 years in information management, Systemware solutions continue to support the changing environment of content while continuing to respond to the evolving needs of customers, both large and small. Systemware has developed a suite of applications that streamline the flow of documents and information through a variety of business processes. Systemware currently serves customers in a variety of industries ranging from financial services, insurance, healthcare, and retail markets
Veritone
Veritone is a provider of artificial intelligence (AI) technology and solutions. The company’s proprietary operating system, aiWARE, orchestrates an expanding ecosystem of machine learning models to transform audio, video, and other data sources into actionable intelligence. aiWARE can be deployed in several environments and configurations to meet customers’ needs. Its open architecture enables customers in the media and entertainment, legal and compliance, and government sectors to easily deploy applications that leverage the power of AI to improve operational efficiency and effectiveness dramatically. Veritone is headquartered in Costa Mesa, California, with over 300 employees and has offices in Denver, London, New York, San Diego, and Seattle
Vodori
Vodori is an innovative technology company creating cloud-based software that revolutionizes how life science companies (such as Acacia Pharma, Pentax Medical, and Tris Pharma) move regulated content from ideation through review, approval, and distribution. Customers love the unparalleled product design and usability, decades of industry knowledge, and world-class customer service, a combination not found anywhere else. The company specializes in providing Digital Marketing, Online Strategy, User Experience, Visual Design, Software Development, Enterprise Content Management, and Promotion Review Systems. The company is based in Chicago, Illinois
Zorang
Zorang is a solution provider with a unique blend of experience and passion for creating Product Information/Content Management, eCommerce, Digital Marketing, Cloud Integration, and Digital Asset Management solutions, providing a compelling end-to-end user experience for its customers. They have more than a decade of experience helping enterprises move to Cloud technologies and integrating their Cloud infrastructure with other enterprise-wide legacy applications. The company helps strategize how to get to the cloud and how SaaS-based systems will integrate into the overall IT architecture to give business and marketing an edge.
See More | https://medium.com/@techmag1/top-content-service-platform-803a78a1edfd | ['Technology Magazine'] | 2020-11-19 12:03:44.159000+00:00 | ['Content Management', 'Content Marketing', 'Technology News', 'Technology', 'Magazine'] |
18 | The Simple Tool That Completely Changed How I Work With Remote Engineering Teams | Originally published on www.Ben-Staples.com
If we ignore, for a moment, the big flaming dumpster fire that is COVID driving almost all technology companies to go 100% remote, more and more software development in the digital age is being done by geographically distributed teams. For example, I work as a Product Manager for Nordstrom, based out of Chicago, IL. I currently have the honor of working with 10 total software engineers. 7 of them are based in Seattle, and 3 of them are based all the way in Ukraine.
Working with a geographically diverse group of engineers is awesome. It brings new perspectives and new team dynamics that, on the whole, build towards a stronger product. Geographically diverse teams result in individuals bringing different ways of thinking about problems and solutions to the table. Of course, it is not all rainbows and butterflies. There can be challenges, especially for teams as they are forming. For example, you need to adjust meeting cadence, if possible, so the time works for all team members.
Oftentimes, cultural differences can create significantly different needs for how feedback is delivered or received. For example, I find that many engineers based out of European countries prefer blunt and overly direct feedback.
Part of this is a language thing: if teams don’t primarily speak the same language, sometimes the flowery adjectives that people tend to use (*cough* me) to try to better describe the sentiment behind a piece of feedback can get lost. However, I personally attribute the majority of this preference in feedback delivery to just different cultural norms. Some folks and/or cultures bias towards the most direct feedback possible, while others take a little more finesse.
Time is not your friend with geographically diverse engineers; it impacts feedback loops, which impacts value to the customer.
One of the biggest drivers of behavior change when moving from a team of engineers all working in one location to a geographically distributed one is, of course, the difference in time zones. At Nordstrom we have about 2–3 hours of overlap with our Kyiv engineers. With an engineer based in the same time zone as you, you have a whole 8 hours in which to talk through a feature, time for the engineer to make changes, and time for design and product to provide feedback on those changes.
THEN, the developer has time to react to those changes and get a second version stood up and ready to go before the day ends. As a result, if you have an engineer, a product manager, and a designer all in the same time zone, you have 8 total working hours all overlapping where feedback, changes, and iterations can be produced.
HOWEVER, if you have an engineer based in a different time zone, you’ve got significantly fewer hours of overlap.
So instead of having multiple opportunities to ideate, build, get feedback, and iterate again, most days you can have only one feedback cycle if your team has an overlap of 2 working hours. Feedback cycles are important; check out this article on the difference between Kanban and Scrum to see some examples of how this comes to life.
This is in no way saying that less overlap in hours means less gets done. That engineer based in another time zone still works a full 8 hours, and is still an awesome engineer with a great skill set. It just means that the active time you have for feedback loops is condensed significantly.
Not only that, but you also need to think about the quality of your feedback and the amount of information you can convey in each round. With an in-person engineering team, you get to talk directly and convey not only the data and information around the feature you’re implementing, but also emotional reactions to the things that are said.
Not all feedback formats are created equal.
Now, in the time of COVID, a bunch of teams are working remotely. You receive a little less data than you would in person, but you can still see people’s reactions, still talk through problems, and still share your screen.
However, when you are working in off hours (i.e., the time when you are working but your engineering team is not, because it is their nighttime), many people rely on text to convey feedback about a feature or idea.
This is the big mistake
Most of the time this comes in the form of Jira ticket comments or Slack messages. If you think about the amount of information you can convey at a time, the highest-quality form of delivery is in person. Next would be video chat, and the very last would be text-based communication. It is just so much less efficient.
Showing what something looks like is much more impactful than telling someone what it looks like. Think about presenting anything; instead of having a slide full of text, show a picture or video and speak to what is happening. I don’t need to ask you to consider which is more impactful.
So we have established that the faster your feedback loop, the more value you deliver to the customer, and the more time you spend working the same daylight hours as your team, the more opportunities you have to provide that feedback. Thus, if you are working with a team in a different time zone, you have less overlap in working hours, and as a result less total time in which you can give and receive feedback on the features your team is building.
Not great, but there has got to be some sort of a solution to help!
There is!!
Screen recordings.
Video recordings take a little more time to make than something like a comment on a Jira ticket, but not much more, and the impact they can have on how much feedback is conveyed is incredible.
Photo by Soundtrap on Unsplash
All you need is a basic screen recorder and a microphone (which of course you already have if you’re working from home). I use a Chrome extension literally called “Screen Recorder”. It is free, and I can quickly record my screen, include the cursor, and add my voiceover.
So now, whenever I work with an engineer in another time zone on any sort of feature, I am trying to build the habit of making a screen recording and walking through specific points of feedback. It conveys so much more: walking through the work, pointing out specific features, points of confusion, and so on. This gives the engineer far more context on the changes you are asking for or the updates needed. The end result is that a screen recording with a voiceover is a pretty good attempt at fitting the information that would normally span multiple rounds of feedback into one rich method of delivery.
Does this solve the problem of just not having as much working-hour overlap with offshore engineers? No. But what screen recordings do is significantly deepen the level of feedback you can provide, so that you convey more in a shorter amount of time.
It takes no money, takes almost no additional time, and I would highly recommend anyone working with any engineering team at all try this out and see how it goes.
About the author:
Ben Staples has over 7 years of product management and product marketing eCommerce experience. He is currently employed at Nordstrom as a Senior Product Manager responsible for their product pages on Nordstrom.com. Previously, Ben was a Senior Product Manager for Trunk Club responsible for their iOS and Android apps. Ben started his Product career as a Product Manager for Vistaprint, where he was responsible for their cart and checkout experiences. Before leaving Vistaprint, Ben founded the Vistaprint Product Management guild with over 40 members. Learn more at www.Ben-Staples.com
I do product management consulting! Interested in finding out more? Want to get notified when my next product article comes out? Interested in getting into Product Management but don’t know how? Want even more book recommendations?! Contact me! | https://medium.com/work-today/the-simple-tool-that-completely-changed-how-i-work-with-remote-engineering-teams-b0eaabac61e1 | ['Ben Staples'] | 2020-12-17 21:40:16.499000+00:00 | ['Product Management', 'Product', 'Technology', 'Agile', 'Tech'] |
19 | Connected Agriculture Market 2021: Explore Top Factors that Will Boost the Global Market by 2026 | Connected Agriculture Market 2021: Explore Top Factors that Will Boost the Global Market by 2026
Connected Agriculture Market 2021–2026
Straits Research has recently added a new report to its vast repository, titled Global Connected Agriculture Market. The report studies vital factors of the Global Connected Agriculture Market that existing as well as new market players need to understand. It highlights essential elements such as market share, profitability, production, sales, manufacturing, advertising, technological advancements, key market players, regional segmentation, and many more crucial aspects related to the Global Connected Agriculture Market.
Connected Agriculture Market
The Major Players Covered in this Report:
Some of the notable players in the connected agriculture market are IBM Corporation, Microsoft Corporation, AT&T, Deere & Company, Oracle Corporation, Iteris, Trimble, Ag, SAP SE, Accenture, Cisco Systems Inc., Decisive Farming, Gamaya, and SatSure.
Get a Sample PDF Report: Connected Agriculture Market
Segmentation is as follows:
By Component,
Solutions, Platforms, Services
By Application,
Pre-Production Planning and Management, In-Production Planning and Management, Post-Production Planning and Management
The report specifically highlights the Connected Agriculture market share, company profiles, regional outlook, product portfolio, a record of the recent developments, strategic analysis, key players in the market, sales, distribution chain, manufacturing, production, new market entrants as well as existing market players, advertising, brand value, popular products, demand and supply, and other important factors related to the Connected Agriculture market to help the new entrants understand the market scenario better.
Important factors like strategic developments, government regulations, Connected Agriculture market analysis, end-users, target audience, distribution network, branding, product portfolio, market share, threats and barriers, growth drivers, and the latest trends in the industry are also covered.
Regional Analysis For Connected Agriculture Market:
North America (United States, Canada, and Mexico)
Europe (Germany, France, UK, Russia, and Italy)
Asia-Pacific (China, Japan, Korea, India, and Southeast Asia)
South America (Brazil, Argentina, Colombia, etc.)
Middle East and Africa (Saudi Arabia, UAE, Egypt, Nigeria, and South Africa)
In this study, the years considered to estimate the market size of the Connected Agriculture market are as follows:
• History Year: 2014–2019
• Base Year: 2019
• Estimated Year: 2020
• Forecast Year 2021 to 2026
The main steps in the investigation process are:
1) The first step in market research is to obtain raw market information from industry experts and direct research analysts using primary and secondary sources.
2) Process the raw data from these sources to extract valuable insights and analyze them for research purposes.
3) Classify the findings into qualitative and quantitative data and organize them accordingly to draw final conclusions.
Key Questions Answered in the Report:
• What is the current scenario of the Global Connected Agriculture Market? How is the market going to prosper throughout the next 6 years?
• What is the impact of COVID-19 on the market? What are the major steps undertaken by the leading players to mitigate the damage caused by COVID-19?
• What are the emerging technologies that are going to profit the market?
• What are the historical and the current sizes of the Global Connected Agriculture Market?
• Which segments are the fastest growing and the largest in the market? What is their market potential?
• What are the driving factors contributing to the market growth during the short, medium, and long term? What are the major challenges and shortcomings that the market is likely to face? How can the market solve the challenges?
• What are the lucrative opportunities for the key players in the Connected Agriculture market?
• Which are the key geographies from the investment perspective?
• What are the major strategies adopted by the leading players to expand their market shares?
• Who are the distributors, traders, and dealers of the Global Connected Agriculture market?
• What are sales, revenue, and price analysis by types and applications of the market?
For More Details On this Report: Connected Agriculture Market
About Us:
Whether you’re looking at business sectors in the next town or across continents, we understand the significance of knowing what customers buy. We solve our customers’ problems by identifying and interpreting their precise target group, while generating leads with the highest accuracy. We collaborate with our customers to deliver a broad spectrum of results through a blend of market and business research approaches. Combining various research and analysis strategies enables us to uncover greater insights while reducing research costs. Moreover, we’re continually evolving, not only in terms of where or whom we measure, but in how our insights can enable you to drive cost-effective growth.
Contact Us:
Company Name: Straits Research
Email: [email protected]
Phone:
+1 646 480 7505 (U.S.)
+91 8087085354 (India)
+44 208 068 9665 (U.K.) | https://medium.com/@shubhamk.straitsresearch/connected-agriculture-market-2021-explore-top-factors-that-will-boost-the-global-market-by-2026-cba6ed0a696f | ['Shubham K'] | 2021-12-22 05:28:28.741000+00:00 | ['Agriculture', 'Market Research Reports', 'Agritech', 'Agriculture Technology', 'Connected Agriculture'] |
20 | Q&A with Sherrell Dorsey, Founder of The PLUG | Q&A with Sherrell Dorsey, Founder of The PLUG
This week, The Idea caught up with Sherrell to learn more about The PLUG — a digital platform that covers the black tech sector. Read to learn about The PLUG’s data-driven journalism, how the outlet plans to close the gap between journalists and black innovation communities, and why Sherrell thinks media outlets struggle to draw diverse audiences. Subscribe to our newsletter on the business of media for more interviews and weekly news and analysis.
Tell me about The PLUG.
The PLUG is a subscription-based digital news and insights platform. We cover the black innovation economy and report on black tech, investment, ecosystems, workforce development, and anything pertaining to the future of technology from the perspective of black leaders and innovators.
We deliver exclusive stories once per week. We’re increasing this frequency to two times a week in June. We also provide data indexes, reports, and other intelligence on the black tech space like black CIOs and S&P 500s. We also have monthly pro member calls where we feature different leaders in the space. Back in February, we launched a live summit to bring together researchers, entrepreneurs, and investors.
We currently have several thousand subscribers with hundreds of paid members. Most of these pro members actually opt for an annual subscription instead of a quarterly one.
And, how did you come up with the idea?
I actually stumbled across the idea accidentally. I have always worked in tech, and one of the challenges for me was not seeing folks that looked like me in the tech scene in the media I was consuming. I grew up in Seattle, so I was constantly around black engineers, black programmers, black network administrators. It bothered me that I wasn’t seeing that reflected in the material that I was reading, and that there wasn’t a rigorous and analytical look at what people of color are doing in the tech space.
In 2008, I started a blog and began writing about black tech innovators who were developing models to provide formerly incarcerated people with employment by training them in solar panel installation. At the time I didn’t have a journalism background, but I was calling publications asking to write about these topics. I found that as I was building up my freelance repertoire, people began reaching out to me because of my knowledge of this space. This inspired me to launch a daily tech newsletter in 2016. Our readers not only included black techies at companies like Google, Facebook, and Amazon but also investors and other tech reporters who felt that they had been missing a diversity lens on tech.
The newsletter was getting traction and advertisers like Capital One and Goldman Sachs started reaching out to me asking to connect with my audience, which eventually allowed me to grow the newsletter into a platform that provides readers with highly data-driven journalism.
How has the platform’s business structure evolved since its inception?
When I first started, it was just me, my laptop, and my WiFi. Then, when Capital One and other sponsors came on board, I was able to grow revenue and start doing some testing on the structure of the business. I would also go on to secure a six-month publishing partnership with Vice Media, where we co-published pieces on diversity and tech on the tech vertical they had at the time, Motherboard.
It quickly became apparent, however, that advertising isn’t the best way to play the long game. So, I started looking into how The PLUG can build a sustainable subscription model. In 2019, The PLUG participated in The Information’s Accelerator, which is an initiative that supports next-generation news publications. Shortly after, we launched a paid version of The PLUG in July.
Aside from that, we also license our content and publish sponsored content. Every now and then, we also secure grants.
What do you think mainstream outlets get wrong when trying to attract black and brown audiences?
Way too often people treat these audiences as a charity and think that giving them free access will solve the issue. It unsettles me when media leaders treat this issue as a philanthropic initiative. We overestimate how much money factors into this. I grew up in the inner city; people pay for what it is they value at the end of the day, rich or poor. You have to have content that these audiences find valuable. Even if you give it to them for free, the content and coverage are not valued if it does not reflect their community or voice in an authentic way.
Did you consider VC funding?
Yeah, I initially thought I was going to secure VC dollars. But, I found that a lot of the pushback I was getting was “Well, we already invested in another black media outlet.” The real question is: why can there only be one? Do black and brown people not have needs and nuances?
What do you think sets The PLUG apart from other black tech media outlets?
Definitely our depth and analysis — The PLUG has extensive data libraries. For instance, we were the first one to develop a map of all black-owned coworking spaces in the country. We cover topics that no one else is asking questions about.
Unfortunately, there’s no centralized source on black tech, and so The PLUG’s ability to bring this data, comprehensive indexes, and in-depth coverage has allowed us to garner a lot of attention. A talent scout for ABC’s Shark Tank recently told me that they use The PLUG to stay informed on emerging start-ups across the nation.
What’s your long-term vision for The PLUG?
I’d like to offer more city-specific reportage on black innovation communities across the country and the world and build a global network of reporters.
I’d also like to move into more research-based initiatives to help fuel academic research, investor analysis, and government policy on black innovation.
In all honesty, though, I don’t even have a 10-year plan. The impetus behind our work is greater visibility and I hope that in 10 years we don’t have to continue staying niche. My hope is that more businesses and tech publications will cover communities of color with the same diligence and rigor as The PLUG. I hope that this kind of reportage is not seen as ancillary but instead more integrated in tech and business reportage.
And with that, I hope that we get purchased and can grow within the walls of a publisher that recognizes our value and the importance of delivering this information to readers.
RAPID FIRE
What is your first read in the morning?
The Wall Street Journal’s tech news briefing and The Daily Stoic.
What was the last book you consumed?
Arlan Hamilton’s It’s About Damn Time and Kevin Kelly’s The Inevitable.
What job would you be doing if you weren’t in your current role?
This is tricky because I like the grind of building something from the ground up. If I wasn’t working on The PLUG, I’d probably be teaching inclusive entrepreneurship at the collegiate level or within vocational training ecosystems. | https://medium.com/the-idea/q-a-with-sherrell-dorsey-founder-of-the-plug-8fd4227440d9 | ['Tesnim Zekeria'] | 2020-05-27 19:38:12.108000+00:00 | ['Journalism', 'Startup', 'Technology', 'Subscriber Spotlight', 'Media'] |
21 | Binance System Upgrade Notice | Heads up, Binancians: the Binance cryptocurrency exchange will undergo a scheduled system upgrade on November 14, 2018, starting at 2:00 AM UTC. This upgrade is expected to go on for eight hours.
During this time, we will suspend deposits, withdrawals and trading. While we want to inconvenience our users as little as possible, we constantly upgrade our platform to give you an even better trading experience, and this upgrade is part of that endeavor.
Here’s a short guide on what you can do before, during, and after the upgrade.
What to do before the Binance system upgrade
Download the Binance app on your smartphone to get push notifications on the latest updates on Binance. Get the app on iOS and Android.
Once you download the app, check whether your orders might be affected by the upgrade, and cancel orders if needed.
What to do during the upgrade
Check our website and official social media pages on Facebook, Twitter, and Instagram to get updates on the upgrade.
Join our official Telegram group and our local communities on Telegram (see a list here). If you have any questions during the maintenance, our team and Angels will be ready to help.
Sit back, relax, and watch a video or 10 on Binance Academy.
Contribute to our mission to make the world a better place via blockchain. Donate to the Blockchain Charity Foundation.
Read our in-depth reports about blockchain projects on our newly launched Binance Research.
Stay updated regarding the latest movement and insights in the crypto world through Binance Info.
Do you have an awesome idea for a blockchain project? Better spend those eight hours hashing out your plan and who knows, Binance Labs might think your idea has a lot of potential.
Still itching to get in on some crypto action during the upgrade? Get the Trust Wallet and explore its many features! Download it now on iOS and Android.
What to do once the upgrade is done
Rejoice.
Log in and check the functions of the Binance platform. Ask Binance support if needed.
Smile because your funds are #SAFU.
Resume trading.
We’ll keep you updated throughout the course of the system upgrade. Thank you for your support! | https://medium.com/binanceexchange/binance-system-upgrade-notice-c3fdf632bd68 | [] | 2018-11-15 08:24:07.510000+00:00 | ['Blockchain', 'Exchange', 'Binance', 'Technology', 'Cryptocurrency'] |
22 | Why so much talk about Decision Science? | When I first came across this term and tried to learn about it, I was overwhelmed by the information. Everyone has picked it up and have written pages on end about it. There was one thing in common though.
We have all read many articles, blogs, and posts on ‘decision science vs. data science’. Yet we haven’t actually stopped to ask why we compare the two, because there is more congruence between them than difference. I would say the one main distinction is that data science delivers the results, while decision science helps us take calculated steps based on those results.
What if I asked you to compare two sports, chess and boxing? Many would argue chess is an activity rather than a sport, but that is a discussion for some other time. Both require attention, patience, practice, and recognizing patterns in the opponent. Yet they are quite different, and there is no need to set up a comparison between them. The missing piece we fail to notice is that they do not affect each other, whereas decision science depends heavily on data science.
Imagine this: you are a data scientist handed a huge database containing all sorts of data, and your superior asks you to comb through it and report your findings. We have all been in situations where, based on certain results, we can easily decide what to do. That is where the decision scientist comes in, and it is not as easy as it sounds.
A decision scientist doesn’t just look at the data provided. They also weigh factors like past experiences, a variety of cognitive biases, individual differences, and commitment. Data science is a tool by which correct decisions can be made. Read more here.
Whoever said that data science is only for math majors or the computer genius in the class? Basic analysis is required in every field, and you do not have to be Alan Turing to go about it. Psychology students learn distributions in their bachelor’s degrees, and biology majors have to understand the normal curve too. In short, if data science is everywhere, so is decision science, and it always has been.
Photo by Frank Vessia on Unsplash
Since we are just venturing into this field, decision science can easily be mistaken for analytics or areas using machine learning, but it is very much a separate discipline.
“While data science is perhaps the most broadly used term, ‘decision science’ seems like the more fitting description of how machines are assisting business leaders in solving problems that have traditionally relied on human judgment, intuition and experience,” according to K.V. Rao, founder and CEO of sales forecasting software company Aviso, in TechCrunch. “It may not be the sexiest phrase in the world — I’ve never seen it in any marketing materials — but ‘decision science’ aptly encapsulates how computers are helping to systematically identify risks and rewards pertinent to making a business decision.”
As I have mentioned above, according to Data Science Central, “Data Scientist is a specialist involved in finding insights from data after this data has been collected, processed, and structured by data engineer. Decision scientist considers data as a tool to make decisions and solve business problems.”
Photo by Emily Morter on Unsplash
By now, we’re all aware of what a big deal being a Data Scientist is. Demand for Data Scientists is skyrocketing and will keep going, maybe until aliens take over. The overlooked champions of the business and technology world, decision scientists, have only just been recognized, and we still know little about them.
If people were rational, behavioral economics would be a myth. From Economics to Marketing to Psychology, all require a Data Scientist and a Decision Scientist. Intellectuals believe that the combination of two can take the business to great heights.
A data scientist just has the data in hand. They have no knowledge of client needs, of promises made by the company, or of whether management has reservations. In short, they do not have a 360-degree view of the problem at hand. Decision scientists, by contrast, have an analytical mind that helps them picture the whole problem and find the best solution for the company and the client or customer. Hence, they are prepared for all kinds of situations.
Because decision scientists are rarer than data scientists, demand for them is only set to increase. The craze for data science had barely started when educational institutions began providing courses, and the trend is still growing: everyone is gearing up with the latest tools, and no one wants to be left behind. Universities have already launched courses for students to become decision scientists, and it’s only a matter of time before schools adopt some of the concepts too.
I have made a list of as many universities as I could find offering the course in major countries. Most of these are PhD programs, as decision science requires some taste of how things run in the real world. It is a great opportunity, as research in the field is still underway.
ASIA:
IIM Bangalore, India
IIM Lucknow, India
INSEAD, Singapore(in all campuses)
CUHK Business School, China
Asia and Pacific decision science institute
EUROPE:
European decision sciences institute
University of Konstanz, Germany
IESE Business School, Spain
Rwth-aachen University, Germany
HEC, Paris
AMERICA:
Decision Science institute
The College of Penn
Carnegie Mellon University
Harvard University
Kellogg School of Management
Thank you for reading!
Do tell me your remarks. | https://medium.com/analytics-vidhya/why-so-much-talk-about-decision-science-f9d8b8bd92d4 | ['Ada Johnson'] | 2020-10-02 17:02:37.434000+00:00 | ['Technology', 'Data Science', 'Decision Science'] |
23 | Outsourcing Work To Laravel Remote Developers: The Dos and Don’ts | Outsourcing Work To Laravel Remote Developers: The Dos and Don’ts
Here’s everything you need to know about outsourcing to Laravel Developers
With digital platforms being popular for expanding business opportunities, more and more business owners are now planning to have a website for their organization.
Nowadays, with Laravel being one of the most promising technologies, business owners prefer it over any other PHP framework.
But a question arises when the hiring process for the app development team starts, and organizations often get confused about how to hire: should they opt for outsourcing or for in-house developers?
Outsourcing being a cost-effective and straightforward approach, the majority of businesses opt for it. Therefore, in this blog, we will look at why outsourcing is often the best approach and at the dos and don’ts of outsourcing remote work to Laravel developers.
What is Laravel?
Laravel is a robust, easy-to-understand, open-source PHP framework that follows the MVC design pattern. Laravel developers can reuse existing components from other frameworks when building a new web app, so the resulting applications are more structured.
Laravel provides a rich set of functionalities on top of the basic features common to PHP frameworks.
Top Four Features of Laravel
Laravel offers some of the best features, and they are:
1. Web App Security
One of the biggest advantages of using Laravel is the security it offers out of the box, which sharply reduces a website’s exposure to common attacks. Laravel keeps the database safe by using salted and hashed password mechanisms, meaning a Laravel development company can build the app so that users’ passwords are never saved as plain text. Each password is converted into an unintelligible hash before it is stored.
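To make this concrete, here is a minimal sketch of how password handling typically looks in Laravel, using the framework’s Hash facade (the $request and $user variables are illustrative placeholders, not code from any specific project):

<?php

use Illuminate\Support\Facades\Hash;

// Hash the password before storing it; Laravel's bcrypt driver
// salts each password automatically, so no plain text is saved.
$hashed = Hash::make($request->password);

// On login, compare the attempt against the stored hash.
if (Hash::check($request->password, $user->password)) {
    // The credentials are valid.
}

Because only the hash is stored, even someone with direct access to the database cannot read users’ original passwords.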
2. MVC Architecture
Model-view-controller (MVC) is the architectural pattern developers adopt while creating an application. It separates the application’s business logic from its user interface. By following MVC, a Laravel developer gets a clear code structure that is far easier to work with and maintain.
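As a minimal sketch of that separation (the Post model and PostController here are hypothetical names, not from any particular codebase), a route hands a request to a controller, which pulls data from a model and passes it to a view:

<?php
// routes/web.php: maps a URL to a controller action.

use App\Http\Controllers\PostController;
use Illuminate\Support\Facades\Route;

Route::get('/posts', [PostController::class, 'index']);

<?php
// app/Http/Controllers/PostController.php: fetches data from the
// model and hands it to a Blade view, keeping logic out of the UI.

namespace App\Http\Controllers;

use App\Models\Post;

class PostController extends Controller
{
    public function index()
    {
        return view('posts.index', ['posts' => Post::all()]);
    }
}

The model talks to the database, the controller coordinates, and the view only renders, so a change to the interface never forces a change to the business logic.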
3. Unit-Testing
One of the main reasons developers choose Laravel is that it makes unit testing easy. The framework can run multiple unit tests to make sure that changes made to the application don’t break existing behavior.
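For instance, a minimal Laravel feature test (the route and class name are illustrative) can catch regressions automatically whenever the code changes:

<?php

namespace Tests\Feature;

use Tests\TestCase;

class HomePageTest extends TestCase
{
    // Fails the suite if a change breaks the home page.
    public function test_home_page_loads(): void
    {
        $response = $this->get('/');

        $response->assertStatus(200);
    }
}

Running php artisan test executes the whole suite, so negative side effects surface before a release instead of after it.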
4. Database Migration System
Database migration means evolving a database’s structure, or moving data from one environment to another. It sounds simple, but done inadequately it can cost an organization valuable data. This is a key reason developers choose Laravel: its migration system treats every schema change as a versioned, reversible step.
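As a minimal sketch (the customers table is a made-up example), a Laravel migration describes one schema change as a class with a reversible pair of methods:

<?php

use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

class CreateCustomersTable extends Migration
{
    // Applied by `php artisan migrate`.
    public function up()
    {
        Schema::create('customers', function (Blueprint $table) {
            $table->id();
            $table->string('email')->unique();
            $table->timestamps();
        });
    }

    // Reversed by `php artisan migrate:rollback`.
    public function down()
    {
        Schema::dropIfExists('customers');
    }
}

Because every change ships with a matching down() step, the team can roll the database forward or back without hand-editing tables, which is where accidental data loss usually happens.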
Why Hire a Laravel Developer?
Laravel is one of the best PHP frameworks in the market that enables app developers to create excellent web applications for their clients. Laravel was launched in 2011 by Taylor Otwell. Being an open-source framework that follows MVC architecture, Laravel has become quite popular amongst web app developers.
In today’s world, where digitalization is taking over every field, everyone aims to take their business or organization onto a digital platform. If you want to do the same and expand your client base, hiring a Laravel developer is one of the best moves you can make: the framework is well suited to creating and polishing robust web applications for businesses of all sizes.
What is Outsourcing?
Outsourcing is the simple practice of hiring an individual or a team outside your organization to deliver a service or a product. Small businesses prefer outsourcing because they might not have in-house resources for web app development, finance, or marketing, so they find agencies that specialize in the work they need done. Big companies, on the other hand, choose outsourcing because they already have a lot of work in flight: they would rather keep their own developers focused on clients’ projects, so they hire an outside app development company for internal work.
This process of hiring developers outside of your organization for a specific project can be more cost-effective than hiring full-time developers and paying them a salary. Besides this, outsourcing can be the best opportunity for many app development companies to expand their work area.
Why Outsourcing is Better Than In-house Developers?
In-house app development can be much more expensive than outsourcing a developer or an app development team. For in-house hiring, the process is longer, and you will have to pay the developer even when they have no work to do, while with outsourcing you only pay for the hours the developer actually works.
Therefore, outsourcing is often preferable to in-house development.
Dos and Don’ts of Hiring Outsourcing Laravel Developers for Remote Work
The Laravel framework enables developers to build cheaper, faster, and safer applications, and its features have made it one of the most used frameworks around. As a result, the majority of organizations choose to hire a Laravel development company to get their business web application developed.
When it comes to hiring a developer for a project, most people go with outsourcing. So let’s look at the dos and don’ts of outsourcing work to a remote Laravel developer, to give you a clear idea of whether it is the right approach for you.
Dos of Outsourcing Work to Remote Laravel Developer
1. Simple Switch
When an organization hires a full-time in-house Laravel developer for a particular project and things don’t go as well as expected, it might struggle to let the employee go, and then face a long hiring process to find a replacement. All of this costs a lot of time and money. Hiring an outsourced Laravel developer avoids that: they are much easier to replace, they only charge for the hours they have actually worked on your project, and tracking their workflow is easier than with an in-house development team.
2. Cost-Effective
One of the best things about opting for outsourcing is that it is a cost-effective approach in every sense. Whenever a company decides to outsource a service, it avoids the costs of hiring a new developer. This makes quite a difference to the business, especially when it comes to overheads: the company doesn’t have to pay a large wage or buy office equipment. Therefore, outsourcing is regarded as the best option, as it is cost-effective.
3. Less Hassle
By outsourcing to a Laravel development service provider, you don’t have to think about organizing multiple interviews, negotiating salaries, or supplying office essentials. Hiring a remote outsourced developer is therefore a hassle-free process, and since they work from home, the organization incurs no extra overhead.
4. Experts
Many startup owners try to handle everything on their own to avoid hiring full-time employees and reduce expenses, but that isn’t the best choice. Having an expert matters for any kind of work, especially when developing an application that will represent your business. To keep expenses down without sacrificing quality, most entrepreneurs therefore outsource to skilled remote developers; these professionals can help you create a better product.
5. Save Software Cost
Another benefit of hiring an outsourced Laravel developer is that the company doesn’t have to provide the costly software it would have to supply to in-house developers. An expert remote developer already has all the software required to build a Laravel application, so hiring one saves a lot of money on software subscriptions.
Don’ts of Outsourcing Work to Remote Laravel Developer
1. Privacy
Outsourcing to a developer or a Laravel application development company can put your organization’s privacy at risk, since the vendor gets access to your company’s professional and confidential details.
2. Time Zone
Some business owners resist the outsourcing approach because the outsourced developer might be in a different time zone, which can make communication challenging.
3. Communication
With an in-house developer, discussing the project is manageable because you work in the same location. But when a business owner hires a remote employee from a Laravel development company, communication becomes harder across time zones, which also delays the resolution of any issues in the project.
4. Legitimacy
Hiring a Laravel developer through online agencies can raise questions of legitimacy. What if the hired developer doesn’t deliver the expected result, or misuses your company’s confidential information? Concerns like these are why some organizations avoid outsourcing services.
Conclusion
Working with remote Laravel developers or hiring an outsourced Laravel app development company isn’t as complicated or risky as people think. Outsourcing is often the better solution, as it saves much of the time and money that goes into hiring an in-house developer. The points above make clear that outsourcing Laravel work to remote developers is very beneficial, and the approach is getting more and more popular with each passing day.
24 | Why Transparency Is Critical in Times of Crisis, Part 2: Decision vs. Discussion | As the economy declines, an increasing number of companies are pivoting from peacetime management to wartime management. Renowned entrepreneur and venture capitalist Ben Horowitz wrote a blog post “Peacetime CEO / Wartime CEO” in which he lists the many differences between the two types of company management. In particular, the following caught my eye:
Peacetime CEO focuses on the big picture and empowers her people to make detailed decisions. Wartime CEO cares about a speck of dust on a gnat’s ass if it interferes with the prime directive. Peacetime CEO strives for broad based buy in. Wartime CEO neither indulges consensus-building nor tolerates disagreements.
Most management techniques and principles are catered towards peacetime, to which most of my previous blog posts are no exception. During wartime — or when the company is fending off existential threats — rules tend to be broken. When I served in the Taiwanese Marine Corps, there was a saying: “Getting everyone marching together towards the second best option is better than sitting around and debating what the first best option is.” The same is true for companies fighting an existential threat: sometimes quick decisions are more optimal than perfect ones.
A common mistake I’ve seen leaders make (during both peacetime and wartime) is to invite a discussion around a decision that has already been made. Those who become invested in the discussion may feel betrayed when they realize the decision was already made and their ideas had no impact on the outcome. In my previous post, I noted that transparency is especially important in times of crisis. Today, I want to specifically discuss communicating an already-made decision and the importance during times when the company needs to act quickly.
Discussion or Decision?
During peacetime, organizations are usually afforded more time to make decisions, and leaders prioritize empowering employees by advising thinking things through rather than being hasty. During wartime, decisions have to be made quickly so people can start executing them, which comes at the expense of employee autonomy. This is when clear communication makes a big difference for your employees.
A few years ago, I was on the hiring committee in search of a new VP of Engineering. The committee, chaired by the CTO, defined the job requirements and interview process. Among them were two requirements highlighted by the CTO:
The candidate must be highly technical.
The candidate must have had substantial experience managing engineering teams at a tech company.
The search went on for a few weeks, but none of the candidates we interviewed were qualified. One day, we received a referral for a candidate whose demographic and career background was very similar to that of the two co-founders. The CTO instructed the committee to “go easy on the technical portions, because they may not pass it.” While it was odd, we obliged.
During the post-interview debrief, the CTO asked for the hiring committee’s recommendation. While the candidate seemed solid, two concerns emerged:
We asked the candidate easier technical questions, so we could not properly evaluate their technical abilities against those of other candidates in the pipeline. The candidate spent their entire career in finance and consulting, and their only stint at a tech company lasted merely 10 months.
The first concern was quickly dismissed, since it was the CTO’s direct orders to have us go easy on the technical portion. The second one stirred a debate: the startup had few people who came from the tech industry, and we wanted someone who had experienced high growth at a tech company.
Despite the committee’s concerns, the CTO argued that the candidate had sufficient tech company experience, citing their 10-month stint at a “tech” startup and experience as a CTO/co-founder of another startup. However, upon further inspection, the CTO/co-founder experience completely overlapped with their full-time job and was really a side project the candidate undertook alone. Surprisingly, the CTO kept repeating his original arguments and insisted that we were being too strict. As time for the meeting ran out, the CTO finally disclosed, “Actually, I already gave [them] a verbal offer, so if there are no other concerns, we’ll move forward with this candidate and wine and dine [them].”
The members of the committee were in shock. We had spent the last 30 minutes debating whether or not we should hire this candidate, yet it turned out that the CTO had already made the decision without us. In the ensuing weeks, people were visibly less vocal in hiring committee meetings. We weren’t sure if we were discussing a possibility, listening to the broadcast of a decision, or being asked to give our opinions which may or may not meaningfully influence the final decision.
These types of management-driven decisions are commonly seen across organizations of all sizes. Leaders are certainly justified in making executive decisions during critical wartime. However, not disclosing that a decision was made and inviting a discussion not only makes people feel cheated, but also causes them to lose trust in the system because they don’t know if their opinions still matter.
Decision, Not Discussion
In wartime, it is important to identify when a decision has been made and notifying the team. Being transparent that a decision has already been made allows you to maintain trust, which serves as the foundation for everyone to concentrate on steering the company away from crisis.
Recently, at Airbnb, I was notified of a strategic and urgent project. The deadline was two weeks away, but I was told that since the scope was sufficiently small and they had already found engineers to work on it, all my team had to do was consult on and review code. I told one of my team members to represent our team for this project and provide guidance when necessary. In other words, I let the project team discuss and decide on how to proceed with the project.
Two days later, when no progress was made, I reviewed the product spec and realized there was no way the pre-selected engineers could complete the project in time because they had no prior experience in our codebase. Furthermore, the product scope was actually quite large, and even if my team took it on, we couldn’t have completed the project in two weeks. I had made a mistake by allowing a discussion (between the product manager and the engineers found from other teams) instead of making a decision and implementing it down the line. This cost us two days, leaving eight working days until launch.
I immediately told the product manager (PM) that I was bringing in three of my engineers — pausing their existing work — and cutting project scope to ensure that we could deliver on time. After some back-and-forth with the original project team and the PM, we agreed to a reasonable scope, set up the engineering plan, and were able to deliver the project on time.
When I talked to my engineers, I explained that because this project was critical, I wanted them to pause their current work to ensure this project’s delivery. While I needed their buy-in, I clarified that the decision had been made and that the discussion needed was around how to build these features on time — not whether their existing work should be paused.
One of my engineers noted during a one-on-one that it was “uncharacteristic of [me] to be so heavy-handed in decisions and execution details.” While he was not happy with the decreased autonomy, he appreciated that I explained upfront the criticality of the project, understood why I intentionally “micromanaged”, and agreed that the project would not have succeeded had I let it run its natural course. | https://kenk616.medium.com/why-transparency-is-critical-in-times-of-crisis-part-2-decision-vs-discussion-38ab4a2a7771 | ['Ken Kao'] | 2020-08-04 20:42:35.424000+00:00 | ['Corporate Culture', 'Management', 'Technology', 'Leadership', 'Transparency'] |
25 | Ninja News! : A Weekly Roundup of Cybersecurity News and Updates | A brewing conflict between tech giants
China is in the hot seat as US and allies, including the European Union, the United Kingdom, and NATO, officially blame it for this year’s widespread Microsoft Exchange hacking campaign.
To keep you in the loop, these early 2021 cyberattacks targeted over a quarter of a million Microsoft Exchange servers, belonging to tens of thousands of organizations worldwide.
On a related note, the US Department of Justice (DOJ) indicted four members of the Chinese state-sponsored hacking group known as APT40 last Monday. This is with regard to APT40’s hacking of various companies, universities, and government entities in the US and worldwide between 2011 and 2018.
Could this be tied up to China’s move to develop cyberattacks capable of disrupting US pipeline operations? Hmm..
Just 2 days ago, Wednesday, Cybersecurity and Infrastructure Security Agency (CISA) and the Federal Bureau of Investigation (FBI) issued a joint advisory that Chinese state-sponsored attackers have breached 13 US oil and natural gas (ONG) pipeline companies way back 2011–2013.
This is developing news so be sure to check out our next week’s release!
Finally, we give you a quick overview of the new malwares in the cyber landscape to keep tab on.
MosaicLoader and XLoader, has joined the game
Bitdefender researchers have confirmed a novel malware posing as a cracked software via search engine advertising. MosaicLoader is a malware downloader designed by its creators to deploy more second-stage payloads on infected systems.
A more niche-specific malware which is known to steal information from Windows systems, was also reported this week to have been modified. The “new and improved” malware can now target macOS. This is definitely the upgrade we never want!
The revamped malware is dubbed as XLoader. Sounds like a console right?
That’s a wrap for this week’s happenings on cybersecurity. Never miss important updates on data breaches, new data protection policies, and other techno trends by following Privacy Ninja! | https://medium.com/@privacy-ninja/ninja-news-a-weekly-roundup-of-cybersecurity-news-and-updates-352c365a821e | ['Privacy Ninja'] | 2021-07-24 05:53:49.767000+00:00 | ['Privacy Ninja', 'Cybersecurity', 'Technology News', 'Cybercrime', 'Data Protection'] |
26 | 2019 in review: top 10 digital diplomacy moments | 9. “Tools and Weapons”
Tools and Weapons is a narrative journey from inside one of the world’s largest tech companies, with a focus on how technology needs partners working closely together to face the biggest challenges of our era and move us forward.
Most of all, this book — released this past September — shows how digital diplomacy is not only about narratives and content, but it’s also about understanding technology and all stakeholders in the tech spectrum and beyond. Digital diplomacy is about social media, as much as it is about policy and diplomacy.
“This is a colorful and insightful insiders’ view of how technology is both empowering and threatening us,” said Walter Isaacson, bestselling author of The Innovators and Steve Jobs. “From privacy to cyberattacks, this timely book is a useful guide for how to navigate the digital future.”
Authors Brad Smith, president and chief legal officer at Microsoft, and Carol Ann Browne, senior director of communications and external relations, understand “that technology is a double-edged sword — unleashing incredible opportunity but raising profound questions about democracy, civil liberties, the future of work and international relations,” said former US Secretary of State Madeleine Albright. “Tools and Weapons tackles these and other questions in a brilliant and accessible manner. It is a timely book that should be read by anyone seeking to understand how we can take advantage of technology’s promise without sacrificing our freedom, privacy or security.”
10. Influence in the Brussels-bubble
Even if you often look at the US as a barometer to understand online influence in politics and government, Brussels is another great example of how social media and digital engagement form the basis of influence on the Internet.
This past summer, POLITICO’s exclusive analysis revealed the influencers in the European Parliament’s Twitterverse. It was a very interesting time to observe what is happening, as the EU legislative body was renewed in May after the Europe-wide elections and a new European Commission started its mandate under President Ursula von der Leyen.
Notably, “the list of international politicians (excluding MEPs) followed by EU parliamentarians is heavily Brussels-dominated,” explain the authors of this report James Randerson and James O'Malley. “It includes just one European leader (Macron), one national politician (Germany’s Martin Schulz, who is a former European Parliament president) and four U.S. politicians (Trump, Obama, Hilary Clinton, and the U.S. president’s institutional account POTUS).”
In September, another interesting study came out, published by ZN Consulting and EurActiv.com. It provides a great picture not just for Brussels-based and European players, but also for Washington DC politicos interested in European affairs and the work of the European Union. | https://medium.com/digital-diplomacy/2019-in-review-top-10-digital-diplomacy-moments-d4e6d9752904 | ['Andreas Sandre'] | 2019-12-22 13:58:34.386000+00:00 | ['Social Media', 'Technology', 'Diplomacy', 'Foreign Policy', 'Year In Review'] |
27 | Dead Bugs Are Getting in the Way of Fully Autonomous Self-Driving Cars | Dead Bugs Are Getting in the Way of Fully Autonomous Self-Driving Cars
Photo: shanecotee/Getty Images
On October 19, 2016, Elon Musk kicked off the lie about Tesla Full Self-Driving capabilities: “The basic news is that all Tesla vehicles exiting the factory have the hardware necessary for Level 5 autonomy. So that in terms of the cameras and compute power, it’s every car we make. So, on the order of 2,000 cars a week are shipping now with Level 5, meaning hardware capable of a full self-driving or driverless capability.”
Don’t get me wrong—I am not among those in the $TSLAQ community on Twitter who believe that everything Musk touches is a massive fraud. While there is much about the accounting around his businesses that is highly questionable and very possibly fraudulent, I think Musk is a true believer in most of the ideas he has brought forth, regardless of how outlandish many seem.
Image: Tesla
But the idea that all Teslas built in the past four years have the hardware necessary to achieve Level 5 (L5) automated driving capability was and is a lie, and for some very mundane reasons.
Just a week after Musk made the above pronouncement, I took to a stage in San Francisco, where I was chairing a conference on automated vehicles. George Hotz of Comma.ai had been scheduled to deliver a keynote that morning. I was checking my notifications as I walked into the event venue and saw the headline that Hotz’s Comma One project had been canceled after the company opted not to respond to an inquiry from the National Highway Traffic Safety Administration asking for more information about it.
Needless to say, Hotz did not appear, so I quickly modified a presentation deck on the state of automated driving development and took his place. In the course of my talk, I asked the audience of about 150 people how many thought that Musk could make good on his L5 promise. I could count the number of people who responded in the affirmative on the fingers of one hand, and then proceeded to give my own explanation of why Musk’s claim was a lie.
I think Musk is a true believer in most of the ideas he has brought forth, regardless of how outlandish many seem.
I often get calls from the media for comment on all aspects of mobility technology, including automated driving and electrification, and I’ve repeated this explanation countless times over the years. The fundamental problem is that most people view Musk as some sort of visionary genius and assume he knows what he’s talking about. Thus, when Musk says things that are fundamentally wrong about automated driving, it becomes gospel for people who are coming from a nontechnical background. It has become my job to correct the misinformation.
Before I explain why no current Tesla vehicle is capable of L5 automation, let me give a brief explanation of what these levels mean. The Society of Automotive Engineers published the J3016 standard in 2014, which defined a taxonomy for levels of automated driving capabilities. Level 0 is for vehicles with no driver-assist capabilities at all. The vast majority of vehicles built today fall within L1, with a variety of discrete functions to assist in the driving task, such as lane departure warning or adaptive cruise control.
Level 2 systems combine multiple functions within a control strategy, such as automatic lane centering and speed control, but still require the driver to pay attention and be prepared to take full control at any time. These systems may or may not require the driver to keep hands on the wheel, but their eyes must remain on the road. As of 2020, Tesla AutoPilot is still an L2 system.
Level 5 (the highest level) refers to fully automated vehicles. These are vehicles that are capable of operating in all conditions without any human supervision or intervention. An L5 vehicle can drive itself in any weather and on any roads that a human can. The next step down (Level 4) can do the same, but within a defined operating domain that can be based on geography, weather, or any other conditions. The vehicles being tested by virtually all other AV companies, including Waymo, Argo, Cruise, Voyage, and others, are all L4.
L4 and L5 vehicles can have human controls, and a driver can take control if they desire, but it’s not required. Both of these systems have levels of redundancy that enable them to get to a safe stop if a problem is detected, even when no one is aboard.
The key distinction between L4 and L5 is the ability to drive anywhere and at any time. To do that, the driving system needs to be able to “see” the world around the vehicle, which means the sensors must remain clean and unobstructed at all times.
Ford sensor cleaning. Video credit: Ford
Most people who are critical of Tesla’s approach to AV technology focus on Musk’s insistence on not using lidar. Tesla vehicles are equipped with eight low-cost cameras, along with a single forward-looking radar sensor and 12 short-range ultrasonic sensors. While this solution is unlikely to yield a sufficiently robust automated driving system anytime in the foreseeable future, it’s not impossible, despite the limitations of cameras and the limited redundancy. Even if Tesla can develop software that can reliably understand the world around the vehicle for a safe L5 system, the cars as they are built today will never be L5.
As of 2020, Tesla AutoPilot is still an L2 system
It all comes back to cleanliness.
Those of us who live in regions that have multiple seasons are well acquainted with the challenges of trying to see out of windows caked in slush or getting illumination from headlamps covered in road salt. Even when the weather is warm, billions of insects are killed every year as a result of windshield impacts, and pollen from trees and plants can build up on all manner of surfaces in addition to irritating sinuses.
Waymo sensor cleaning system. Source: Waymo
Ford, GM, and Waymo all have systems on their automated vehicles designed to minimize the collection of anything that can obscure sensors. While washer nozzles, air curtains, and wipers may seem incredibly mundane compared to neural networks and machine learning, it's often those mundane details that trip up an engineering project. Remember when NASA's Mars Climate Orbiter was lost because engineers failed to convert crucial figures from English to metric units?
If you can’t keep the sensors of an AV clean, it can quickly be blinded. If the virtual driver can’t see, it can’t plot a course through its environment. If that happens, the vehicle can’t drive and is not L5 automated.
A radar sensor caked in slush and winter road grime after 10 minutes of driving. Photo: Author
Of the eight cameras on current Tesla vehicles, three forward-looking imagers are mounted above the rearview mirror and sit within the swept area of the windshield wipers, which could theoretically keep them unobscured. The other five on the sides and rear have no cleaning system at all. As someone who has been driving in winter conditions for nearly four decades, I can guarantee that all five will get obscured with an array of road grime.
However, even those forward cameras aren’t completely safe. When snow and ice build up, the outer tips of wiper blades frequently get lifted up and don’t wipe the complete area, which just happens to be where those cameras are located. Those insects I mentioned? When a mayfly goes splat on a windshield at 70 mph, I’ve yet to encounter a wiper anywhere that can remove it. Don’t get me started on the streaky mess when you try to wipe away bird droppings.
Even in warmer weather, the insects collected on the front of a vehicle can make sensors unusable. Photo: Author
As a self-proclaimed visionary, Elon Musk may not spend much time considering mundane tasks like scraping bugs and dung off windshields, but they are all a necessary part of life. Machine vision is not exempt from the need for a clear view. In fact, it’s more sensitive to such obscurants than the human brain, which can actually do a shockingly good job of seeing around such annoyances.
Until Elon Musk can handle the mundane, Teslas will never have Level 5 automated driving capability. | https://debugger.medium.com/dead-bugs-are-getting-in-the-way-of-fully-autonomous-self-driving-cars-abf5bdf932fb | ['Sam Abuelsamid'] | 2020-10-19 05:32:07.950000+00:00 | ['Elon Musk', 'Autonomous Cars', 'Transportation', 'Technology', 'Tesla'] |
28 | Quantum Computing and Blockchain | An Honest Introduction to Quantum
Popular science journalism severely overestimates the applications of quantum computing. Quantum computing is fantastic at doing a small, specific subset of calculations, but it won’t be able to magically “try all the answers” in parallel.
At MIMIR, whether we are presenting at conferences or working directly with our growing customer base, we're often asked by existing companies if incorporating blockchain is worth it for them. One of the most prevalent concerns we hear is: "quantum computing will soon make blockchain obsolete, so why bother advancing the tech?" Hearing this question often enough, we've decided to put the threat of quantum computing into an honest and realistic perspective. In short, quantum will not be the end of blockchain, and it is not as all-powerful as you may think.
That said, there are areas that quantum computing is likely to disrupt, particularly encryption and cryptography, and that includes blockchain technology. A large enough quantum computer could crack the encryption and signature schemes used in cryptocurrencies. However, several adaptive measures are already underway to mitigate the quantum threat.
To understand where the quantum threat exists, we first need to understand how encryption works. A lot of this will be a gross oversimplification for explanatory reasons.
RSA vs ECC Disclaimer
For the purpose of understanding, we will be using the RSA (Rivest–Shamir–Adleman) cryptosystem, long used to secure HTTPS connections. RSA encryption uses prime numbers and factoring to create a one-way function: it is easy to encrypt something, yet very difficult to reverse-engineer the data from the encryption.
Bitcoin and Ethereum do not use RSA. They use ECC (Elliptic Curve Cryptography). ECC still relies on a one-way function, except instead of prime numbers and factoring, it relies on points on an elliptic curve. ECC produces much shorter keys for the same security level and lends itself well to signature schemes.
When you are trying to crack a private key under RSA, you are basically trying to find the prime factors of a really big number. When you are trying to crack a private key under ECC, you are basically trying to recover the secret multiplier behind a point on an elliptic curve. Though the problems differ, their relationship to quantum computing is very similar: whether it is finding factors or scalars on a curve, quantum computing poses a threat. That said, prime numbers and factoring are much easier to grasp than elliptic curves, so we will lean on RSA below.
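Still, for the curious, here is what "points on an elliptic curve" looks like in practice: a minimal sketch of point arithmetic on a tiny textbook curve. The parameters are purely illustrative and nowhere near real key sizes, and the modular inverses via `pow(..., -1, p)` need Python 3.8+.

```python
# Toy curve y^2 = x^3 + 2x + 2 (mod 17): a classic textbook example,
# nowhere near the key sizes used in real cryptography.
p, a = 17, 2
G = (5, 1)  # a generator point that lies on the curve

def ec_add(P, Q):
    """Add two curve points; None stands for the point at infinity."""
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None
    if P == Q:  # doubling: slope of the tangent line
        m = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
    else:       # addition: slope of the chord
        m = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (m * m - x1 - x2) % p
    return (x3, (m * (x1 - x3) - y1) % p)

def scalar_mult(k, P):
    """k*P by repeated addition (real implementations use double-and-add)."""
    R = None
    for _ in range(k):
        R = ec_add(R, P)
    return R

private_scalar = 7
public_point = scalar_mult(private_scalar, G)
print(public_point)  # (0, 6): recovering the 7 from this point is the hard part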
How Encryption Works
Data is sent to your public address. The person sending that data will encrypt the data against your public key. Your public address is the hash of your public key and is basically just a shortened, more convenient version of it.
This data can now only be opened by your private key. This level of cryptography is well-understood, but how does it work?
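Before unpacking the math, here is what that round trip looks like in code: a minimal sketch using Python's `cryptography` package, with a made-up message (real systems layer protocols on top rather than calling this directly).

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Generate a key pair; internally, the private key holds the two secret primes.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

# Anyone can encrypt against the public key...
ciphertext = public_key.encrypt(b"data sent to your address", oaep)

# ...but only the holder of the private key can open it.
assert private_key.decrypt(ciphertext, oaep) == b"data sent to your address"
```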
Your public key is basically the product of two very large prime numbers, and your private key contains the two primes that were used to create it.
We will use relatively small primes to demonstrate this. We know 13 and 17 are both prime. Multiplying them together is easy. But if I asked you to find the prime factors of 221, it would take you a while: the only way is to check, one by one, which primes divide 221 until you find the correct pair. Do this with primes that are hundreds of digits long, and you find yourself with a near-impossible task.
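A few lines of Python make the asymmetry concrete (a toy sketch; real RSA moduli are astronomically larger):

```python
def crack(n):
    """Recover two prime factors of n the slow way: trial division."""
    for candidate in range(2, int(n ** 0.5) + 1):
        if n % candidate == 0:
            return candidate, n // candidate

print(13 * 17)     # 221 -- multiplying is instant
print(crack(221))  # (13, 17) -- factoring means searching one by one
```

Swap a 617-digit RSA-2048 modulus in for 221, and that loop would not finish in the lifetime of the universe.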
Cracking a current private key this way would take modern computers longer than a human lifetime, and that is an understatement. Blockchains use asymmetric encryption, which is hard to crack precisely because it wraps data inside computationally difficult math problems: the forward math is easy, but running it backwards is brutal for our computers.
That is, until quantum computing came along.
Quantum Computing Explained
Traditional computers use bits. Everything is stored as a 1 or a 0 or, in other words, as true or false. This happens through teeny-tiny transistor switches that are either "on" or "off".
This binary system means that when a computer is searching for prime factors or for points on an elliptic curve, it must test each option one by one. That makes certain computations prohibitively expensive.
Quantum computing replaces bits with qubits, which give quantum computers exponentially improved abilities on certain mathematical calculations. A qubit can be in a superposition of both states (0 and 1, true and false) at the same time. This may sound confusing, but don't try to rationalize it in your head; the superposition and interference of qubits are daunting even for working physicists. But not all hope is lost.
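The bookkeeping, at least, is easy to simulate classically: a single qubit is just a pair of complex amplitudes, and a Hadamard gate places it in an equal superposition. Here is a toy sketch using NumPy:

```python
import numpy as np

# A qubit's state is a pair of complex amplitudes: [amp(|0>), amp(|1>)].
zero = np.array([1.0, 0.0])                   # definitely |0>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

superposed = H @ zero              # an equal superposition of |0> and |1>
print(np.abs(superposed) ** 2)     # [0.5 0.5]: 50/50 odds on measurement
```

Measuring that state yields 0 or 1 with equal probability; the power comes from interference across many such amplitudes. With that picture in mind, here's what you need to know: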
Qubits allow computers to use fundamentally more efficient methods for certain computational problems. Shor's algorithm, in particular, lets a sufficiently large quantum computer factor integers and compute discrete logarithms in polynomial time. Calculations of this kind can take a traditional computer many years; a quantum computer running Shor's algorithm could solve the same problem in a tiny fraction of the time, allowing it to crack the private keys on most blockchains quickly and efficiently.
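To see which step the quantum computer actually accelerates, here is a classical sketch of the number theory behind Shor's algorithm, reusing the 221 from earlier. Everything below is ordinary Python; the period-finding loop marked in the comments is the only part Shor's algorithm moves onto quantum hardware.

```python
from math import gcd

def factor_via_period(N, a=2):
    """Factor N using the period of a^x mod N. The period search below is
    brute force; it is the one step Shor's algorithm runs on quantum hardware."""
    if gcd(a, N) != 1:
        return gcd(a, N), N // gcd(a, N)  # lucky: a already shares a factor
    r = 1
    while pow(a, r, N) != 1:  # classical brute force over the period
        r += 1
    if r % 2 == 1 or pow(a, r // 2, N) == N - 1:
        return None  # unlucky base: retry with a different a
    x = pow(a, r // 2, N)
    return gcd(x - 1, N), gcd(x + 1, N)

print(factor_via_period(221))  # (13, 17), via the period of 2^x mod 221
```

On real key sizes, the classical period search is hopeless, which is exactly why the quantum version matters.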
However, quantum computing is still far from the point where it poses an immediate threat to blockchain.
The Future of Quantum and Blockchain
It's not that easy to build a quantum computer. Qubits are highly sensitive and can only be used under very specific conditions; among other constraints, most current designs must be kept at temperatures extremely close to absolute zero.
To make things worse, the more qubits you add, the harder the system is to maintain. As of now, the best we have is IBM's recent 50-qubit computer. So how many qubits would a computer need to defeat encryption?
Estimates show that ECC is actually faced with a more immediate threat than RSA encryption.
Cracking a 224-bit ECC key would require roughly 1,300 qubits; the roughly equivalent 2,048-bit RSA key would require about 4,096 qubits.
This means ECC will likely be one of the first cryptographic schemes to get hit by quantum computing. RSA, however, will not be far behind given the growth rate of the technology.
But blockchain is not doomed. Ethereum, for example, is well underway on efforts to counter the threat posed by quantum computing. Quantum-resistant signature schemes are already being developed, and shorter-term measures, like using quantum hardware to generate truly random numbers within cryptography, are also being explored. The technical community is well aware of the threat quantum poses, and they are handling it.
So although quantum computing may eventually pose a threat to blockchain, it is neither an imminent threat nor one that will come as a surprise. It is a threat that can be overcome and is being worked on now. At MIMIR, we believe in the developer community for blockchain. And so long as a platform's success is determined by its developers, we think it's safe to assume that blockchain is, and will remain, in good hands.
One year ago, Vitalik, founder of Ethereum, said this: “Ethereum is also going to be optionally quantum-proof with EIP 86 and Casper, because it will support any signature algorithm that the user wants to use.” Blockchain technology is in its infancy and will continue to evolve. For this reason, quantum will not be the end of blockchain.
DISCLAIMER: The content provided on this site is opinion and commentary on topics related to the blockchain universe. IT IS NOT INTENDED TO BE NOR SHOULD IT BE RELIED ON BY YOU FOR ANY REASON AND IS PROVIDED “AS IS” WITH NO WARRANTIES OF ANY KIND. You are responsible for your own decisions and for properly analyzing and verifying any content. | https://medium.com/mimir-blockchain/quantum-computing-and-blockchain-83ea9fdfacb7 | ['Mustafa Inamullah'] | 2018-07-27 15:52:26.155000+00:00 | ['Blockchain Technology', 'Ethereum', 'Quantum Computing', 'Science', 'Bitcoin'] |
29 | Renee Bergeron: Committed to Making Technology Accessible for Businesses of All Size | A RadioShack TRS-80 personal computer led Renee Bergeron to technology. The machine arrived at her house through a family friend who gifted it to her because he saw no purpose in it. Toying around with that piece of hardware piqued Renee's interest, kindled a love of technology, and ultimately inspired her to study computer science, which led her directly into the technology world.
As an enthusiastic young technologist, Renee programmed for clients like Air Canada, Hudson Bay Cap, and the National Bank of Canada, which gave her a window into various businesses and helped her amass knowledge. When presented with an opportunity, she moved to Australia as CIO of the Bank of Melbourne. Operating in the service-provider spectrum in corporate America allowed her to experience technology from the business side. As she progressed, her journey with Ingram Micro gave her insight into technology supply chain management for SMBs and into building cloud businesses. Currently, she is the Senior Vice President & General Manager of AppSmart.
"In retrospect, my life could have been drastically different if I had been gifted a stethoscope, or if I were not passionate about technology in all its forms. I am incredibly lucky to work in a technical field that excites me and motivates me," she asserts.
In an interview with Insights Success, Renee shares her valuable insights on how she is empowering businesses to soar through technology innovation.
Below are the highlights of the interview:
How do you diversify your organization’s offerings to entice the target audience?
We are all consumers of technology in the personal and professional aspects of our lives. Consumers are sophisticated and differentiate solutions based on their needs and convenience. So ultimately, as providers, we focus on the customer experience. I invariably put myself in the customers' shoes to experience our offerings. If you would not appreciate the service as a consumer yourself, why should a customer? Hence, we internalize the experience before we build on it.
Businesses are essentially groups of individuals who employ technology to ease their workload, and I think about their goal to serve them. For instance, if our customers are dentists, as a business leader, my role would be to find what technology would enable dentists to serve their customers best. This process would involve me personally reaching out to my dentist to understand their requirements to build a roadmap of new solutions we can develop. Without the desire to solve customer problems, and deliver the best experience, your solutions do not make an impact.
How do you strategize your game plans to tackle the competition in the market?
The general assumption is that tracking your competitors lets you structure a better plan to get ahead of them. However, I believe that practice only puts you in a box, leaving you desperate to redesign the box with no scope for reinvention. Hence, I prefer market analysis; building strategies requires market study, understanding complexities, and identifying opportunities.
Data is my best friend. Slicing and dicing the market to understand organization types, their growth, and the market's scope gives me a unique edge. Market study also helps us predict future trends and equips us to transform the space. Looking at the overall space, rather than being constantly engrossed by competition, is essential to playing the long game in technology.
What are the vital traits that every businesswoman should possess?
Intensity determines your success. It is the key to consistently delivering and reinventing the customer experience, and investing the energy to maintain that pace requires intensity and passion. Perseverance has been one of my greatest strengths over the years, even at my lowest points.
As a leader, a positive mental attitude is critical. Imbibing an optimistic outlook creates an uplifting environment for your team to deliver their best results. Optimism is contagious and should be adopted by every business executive for motivating oneself and others.
Lastly, I follow True North: a practice of making decisions based on data rather than emotions. We are humans with biases, and business judgments made on emotion ultimately fail us. Data, however, are facts, and facts are essential for business development and sustainability. In my opinion, this is one of the most important disciplines one must strictly follow.
As per your opinion, what roadblocks or challenges were faced by you in a corporate business? And how did you overcome them?
My journey in this industry exposed me to some mainstream challenges, obstacles I saw my peers face as well, irrespective of gender and nationality, such as a board rejecting your proposal. In my opinion, these are the roadblocks essential to transforming into a leader. The most dangerous roadblock we face is the limitations we create for ourselves. Constantly questioning our capabilities, judging our experience, and assuming how we are perceived hampers a growth mindset.
However, there are specific challenges, like communication gaps, that can affect your performance and growth. In one instance, my manager was outspoken with everybody on the team except me. The anger, the appreciation, and the guidance were all out there for them, while I received passive-aggressive feedback that led to confusion.
Transparent communication was the key to identifying the root cause. He assumed that being direct would make me emotional because I am a woman. It was an eye-opening perspective, and my solution was to encourage him to accept my emotional response, if any. I presented him with a pile of papers that read, "This gives you a right to make me emotional," normalizing something he was terrified of. That conversation changed the relationship, and he went on to mentor me and open up growth opportunities. The point of this story is to encourage honest conversations that address awkward, undefined problems.
What are your insights on “The myth of meritocracy”? And how it could bring a change in today’s business arena?
Working hard is a basic qualifier in the industry today. Beyond that, one must understand the stakeholders and their requirements. It is about demonstrating the ability to do your job under any circumstance.
Often people overlook the dimensions of ‘what’ and ‘how.’ Our industry houses many credible and hardworking experts who are good at ‘what’ they do. But ‘how’ they do their work sets them apart; not all hard-working individuals are agile, team-players, or the most efficient workers.
Personally, your ability to present yourself also adds value. People will fail to recognize hard work if it is done behind closed doors. It is vital to draw a roadmap, present your outcomes and secure a sponsor, who will vouch for you. You have to chart a path for yourself surrounded by the right people that will help you expand your arena.
How do you cope up with capricious IT and other technological trends to boost your personal growth?
Technology forces change. As consumer demand increases, the market dynamics shift, propelling change; this cycle constantly helps me evolve. Being in this space, you inevitably embrace the role of change agent, which carries into personal life as well. As we age, we need to exercise our brains to stay healthy, and this environment of continual shift helps us do just that.
Seldom do we focus on the real force behind this constant change: people. It is a beautiful cycle between humans and technology; one change forces the other.
What are your future endeavors/objectives and where do you see yourself in the near future?
The perk of operating in the technology industry is that my future can take any number of forms. While I know what the next few years will look like on a broader spectrum, it will be exciting to be on that journey as new opportunities emerge. My goal, making technology accessible for businesses of all sizes, will be constant; but my path there will take different forms, introducing me to bigger and better experiences. | https://medium.com/@insightssuccess/renee-bergeron-committed-to-making-technology-accessible-for-businesses-of-all-size-441552c7f67d | ['Insights Success'] | 2020-11-25 13:07:01.582000+00:00 | ['Women In Tech', 'Female Entrepreneurs', 'Women In Business', 'Technology'] |
Sample with the keyword "Technology" taken from https://huggingface.co/datasets/fabiochiu/medium-articles
Downloads last month: 84
Size of downloaded dataset files: 17.8 MB
Size of the auto-converted Parquet files: 10.4 MB
Number of rows: 3,000