2016/03/07 03:38:28
BobF
Noel Borthwick [Cakewalk]
BTW, to clarify: Cakewalk analytics in itself is NOT a feature. Rather, it's an internal tool that is intended to help us with other features. It's not unlike the older "usage logger" that I mentioned in this thread earlier.
The sole purpose of posting it in this forum was to notify people of how it works and not to advertise it as a feature.

Noel - This is intended to clarify some of the resistance/negativity you're seeing.  I'm presenting this with all due respect, from the perspective of a user who is anxious to see Sonar and Cakewalk succeed in the long term.
 
I am personally not concerned about Cakewalk invading my privacy with this new tool.  I'm not concerned about potential performance problems either.  I am disappointed that effort is being expended to integrate and deploy a tool that is supposed to help Cakewalk figure out what needs to be done - where to focus work.
 
This disappoints me because there is already a huge list of things that need to be addressed in Sonar.  It's not like Sonar has reached a plateau of feature richness and stability.  Seeing dev staff time (funded by me and other users) expended on a tool to help you figure out what needs to be done, when the existing tools appear to have given quite a list already, is where the negative vibes are coming from.
 
Please don't let this point get lost in the peer-bashing against those speaking out.  If you would like me to go through the FR forum and produce a list that demonstrates the time and effort folks have put into providing feedback, I'll be happy to do that for you.  There is some pure gold in that forum.  It may have originated from a small sampling of Sonar users, but I would be willing to bet a handsome sum that there are quite a few items on that list that are no-brainers across the spectrum of Sonar users.
 
Instead of a monster effort to clear some backlog, we're seeing an apparent time-out to implement yet another way for Cakewalk to get feedback to help decide what to do next.  Really?  Again, if you would like me to go through the list, I would be more than happy to do that.
 
All things considered, you guys did a GREAT job in 2015.  You met a huge wall of skepticism about your new model with solid results.  You balanced the need to implement new features with the need to address bugs and workflow improvements.  However, there is still a lot to do as far as I'm concerned.  And the bulk of that is clearly visible, clearly identified and past due.
 
PLEASE don't let distractions slow down the momentum. 
 
 
2016/03/07 03:47:25
tenfoot
bapu
The only thing about analytics I'm paranoid about is that Cakewalk discovers I work in the nude.


In the midst of all the tension, yet again Bapu makes me laugh out loud. You are hilarious, dude. :)
2016/03/07 11:55:26
ChristopherM
Anderton
Chrome thinks I want dog sweaters, leashes, and chewy treats.
 

 
You'll be surprised at how good you look in a stylish dog sweater (although the leashes are for special tastes only). The treats aren't called treats for nothing, either.
2016/03/07 12:26:31
stevec
ChristopherM
Anderton
Chrome thinks I want dog sweaters, leashes, and chewy treats.
 

 
You'll be surprised at how good you look in a stylish dog sweater (although the leashes are for special tastes only). The treats aren't called treats for nothing, either.

Wonderful.... Thanks for that mental image.
2016/03/07 12:35:16
brconflict
Something that may need adjusting in our understanding of this data gathering: from what I've gathered myself, CW is interested in what Sonar is doing, and perhaps in some of the attributes Sonar already sees on your system (think of the Performance module in the Control Bar). To put it into perspective, this is a way Sonar could show CW which features Sonar customers (not just you) use the most or least. As many of you are aware, the CW Bakers have a limited amount of time and resources to address every little detail of Sonar in every new version. But what if they could see that the bulk of users never use, say, the Step Sequencer? That would certainly tell them not to spend a whole lot of time there when there are other areas of Sonar that might better suit the masses in the next update. Would it be useful for CW to know what types of things you do most, so they could tailor their efforts to make those tasks work more fluently and efficiently? That's possible.
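To make that concrete, here's a minimal sketch in Python of how feature-usage events could be tallied. The event names are made up for illustration; nothing here is Cakewalk's actual implementation:

    from collections import Counter

    # Hypothetical feature-usage events, one entry per time a user
    # invokes a feature in the UI.
    events = ["piano_roll", "export_audio", "piano_roll",
              "step_sequencer", "piano_roll", "export_audio"]

    # Tally the events so the most- and least-used features stand out.
    usage = Counter(events)
    for feature, count in usage.most_common():
        print(f"{feature}: {count}")

Aggregated across thousands of installs, a report like this is what would tell the Bakers where their time is (or isn't) well spent.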
 
Analytics, in general, is part of modern-day technology, and we've ALL adapted. For example, when you use Google, if your browser allows them to track you (they do, by default), the ads that show up are for sites you've visited, not male enhancement. Would you rather see ads for new plug-ins or new pills? That's the general idea with analytics, but here it pertains only to Sonar, and maybe to Sonar usage or ancillary things around Sonar. Truly, CW has no interest in whether you use a password program, bank software, or other such things you wouldn't even trust Microsoft with knowing, let alone your kids. However, Microsoft wants much of that information. CW doesn't.
 
Now, let's put this in the context of Windows 10. Windows 10 wants all this info and more--MUCH more. You can disable that collection as well. Do you accept Microsoft doing this? Who would you trust more with ANYTHING collected for analytics: the OS manufacturer, who is frequently bartered with to gather analytics, or Cakewalk, who is not the only game in town and is careful not to alienate its customers?
 
And, don't forget! If you don't want to participate, you don't have to. Turn it off. It's that easy!
 
I'll give credit to CW for at least announcing this, vs. tossing it in without you knowing or being able to opt out. Not everyone is this transparent.
2016/03/07 12:40:10
BobF
brconflict
 
And, don't forget! If you don't want to participate, you don't have to. Turn it off. It's that easy!
 
I'll give credit to CW for at least announcing this, vs. tossing it in without you knowing or being able to opt out. Not everyone is this transparent.

Yes, they do deserve credit for this.  Advance announcement and optional levels are very positive moves.
2016/03/07 13:33:01
bapu
What would analytics show about this thread?
2016/03/07 13:47:06
Ryan Munnis [Cakewalk]
 
BobF
I am disappointed that effort is being expended to integrate and deploy a tool that is supposed to help Cakewalk figure out what needs to be done - where to focus work.

I'd argue you're misinterpreting the effort here, or at least missing the mark on how this could be helpful (whereas other current avenues of communication are not). As an example:
 
Noel mentioned "dropouts", so I'm sort of just reacting to that, considering conversations I've been in repeatedly behind closed doors here. For years our developers have asked those of us who have close interactions with end users questions like:
 
"What are the most common causes of crashes that users experience?"
 
Previously we only knew about crashes if people reported them directly. We worked toward resolving this by creating the fault reporter, which runs when SONAR crashes. Collecting, parsing, analyzing, and reporting crashes via this system is all automated. This has resulted in countless stability fixes. Of course, crash reporting can all be done manually as well, but the number of fixes that required zero human action (other than a developer fixing the crash) has proven well worth the effort.
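For illustration only, here's roughly what that kind of automation does with incoming reports; the field names and values are hypothetical, not the fault reporter's real format:

    from collections import defaultdict

    # Made-up crash reports; real fault reports carry far more detail.
    crashes = [
        {"module": "sonar.exe",      "offset": "0x4a1f", "build": "A"},
        {"module": "thirdparty.dll", "offset": "0x0030", "build": "A"},
        {"module": "sonar.exe",      "offset": "0x4a1f", "build": "B"},
    ]

    # Group reports by a "signature" (faulting module + offset) so the
    # same underlying bug clusters together regardless of who hit it.
    buckets = defaultdict(list)
    for report in crashes:
        buckets[(report["module"], report["offset"])].append(report)

    # The largest buckets answer "what are the most common crashes?"
    for signature, reports in sorted(buckets.items(),
                                     key=lambda kv: -len(kv[1])):
        print(signature, "->", len(reports), "reports")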
 
"What would you say are the most common questions customers call about for assistance?"
 
We have a giant database of phone call & email history in our internal ticketing system that allows our support team to easily reference what the big-ticket items are. The forum itself also offers a ton of insight into user hang-ups. It's pretty easy to come to a conclusion on what users often need help with.
 
"What are the most common bugs users run into?"
 
We created a Problem Reporter so that users and Cakewalk staff alike could log bugs, report them internally, and notify end users directly when they're resolved. This has resulted in countless fixes over the years. We're always working on improvements to this, but even today it integrates directly with our internal bug-tracking software and fault-reporting system.
 
"What are the most requested features made by users"
 
We currently have this Feature Request forum and are building a new Feedback Portal to improve upon this experience overall. In the past it was just a suggestion inbox. We're working on making this much better.
 
etc. etc.
 
 
My point in mentioning this is that we're always trying to deliver information from end users to development in a more efficient manner. Those are only a few brief examples. But here's the thing: Cakewalk developers often also ask things like:
 
"How often do customers experience dropouts?"
 
Honestly, I'd love to know.
 
The answer to this question is always extremely subjective. Support representatives may say "quite often" because they're often on phone calls with customers using integrated sound cards with poor-performing drivers, before those customers have learned how to configure SONAR for use with their new audio hardware. QA might say "occasionally" because they're used to testing and working with beta testers who own and work with superior hardware but also know beta builds can be unpredictable from time to time. Developers themselves might say "never" because optimizing their systems for audio performance is completely second nature. End users themselves will give a different answer every time based on their own subjective experience.
 
The term "dropout" itself is also interpreted a few different ways. Oftentimes customers simply report, "I get tons of dropouts", but what exactly are they referring to? Are they referring to clicking and popping during playback, or to the audio engine stopping? We've even had users refer to the timeline intentionally stopping at the project end as a "dropout" (yes, this is true). So if an email/call/bug report comes through like that, what do we do with that data? Is it factual to refer to that as a "dropout", or do we make a clear distinction? How do we build an accurate report of whether or not dropouts are plaguing customers? How do we build an accurate report of whether or not a particular build of SONAR we just released has increased or decreased the number of dropouts end users are experiencing?
 
The problem is that nobody can really give the developers a helpful answer here. It's usually vague, very subjective, and lacking any specifics helpful to troubleshooting and making improvements.
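That ambiguity is exactly what instrumentation can remove. As a sketch of the idea, with reason codes I've invented for illustration (not Cakewalk's actual scheme), the engine would record why it stopped, so "dropout" stops being one fuzzy word:

    # Hypothetical reason codes for unplanned engine stops. An intentional
    # stop at the project end never emits an event, so it can't be
    # miscounted as a dropout.
    DROPOUT_REASONS = {
        1: "buffer underrun (click/pop or engine stall)",
        2: "audio device lost or reconfigured",
        3: "disk too slow to stream audio",
    }

    dropout_log = []

    def record_dropout(reason_code):
        dropout_log.append(DROPOUT_REASONS.get(reason_code, "unknown"))

    record_dropout(1)
    record_dropout(3)
    print(dropout_log)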
 
Noel mentions an example of doing cool & helpful things to assist someone who may be experiencing recurring dropouts. This is part of the spirit of analytics. It seems like a lot of people are fearful that this is being implemented while feature requests right here on this forum sit out in plain sight, but I'd argue that none of the previous systems we have in place can answer something like "how often do people experience audio dropouts?". Very different goal, if you ask me.
 
I'd also argue that as of today, it's very hard for us to say whether or not SONAR Newburyport experiences fewer dropouts than SONAR Braintree. I could dig up a report on crash stability (because of our aforementioned fault reporter), but digging up a report on audio engine performance would require benchmarking. In other words: a much smaller set of data, difficulty getting real metrics outside of a control group, and a lot of very valuable time consumed.
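If those events were reported automatically, the build-to-build comparison becomes simple arithmetic. A sketch with invented numbers (the build names are real releases; the figures are not):

    # Invented per-build counts: sessions run and dropout events reported.
    sessions = {"Braintree": 120000, "Newburyport": 115000}
    dropouts = {"Braintree": 9600, "Newburyport": 6900}

    for build in sessions:
        rate = dropouts[build] / sessions[build]
        print(f"{build}: {rate:.1%} of sessions reported a dropout")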
 
So I guess my ultimate argument is that the effort here will provide us with insight not currently (nor easily) available to us. It doesn't replace the other areas we look to for insight.
2016/03/07 13:54:18
BobF
I honestly hope Cakewalk gets exactly what they need from this.
2016/03/07 14:07:36
pwalpwal
bapu
What would analytics show about this thread?


Depends on the metrics defined.