2016/03/07 14:12:37
stevec
pwalpwal
bapu
What would analytics show about this thread?


depends on the metrics defined




And only if they're measured in inches.
 
2016/03/07 14:36:32
pwalpwal
stevec
pwalpwal
bapu
What would analytics show about this thread?


depends on the metrics defined




And only if they're measured in inches.
 


and distributed ergonomically, of course


2016/03/07 14:38:08
Anderton
brconflict
I'll give credit to CW for at least announcing this, vs. tossing it in without you knowing or being able to opt out. Not everyone is this transparent.

 
FWIW I believe Ableton Live's EULA mentions that analytics are built-in, although you can opt out. I'm sure they're not trying to put anything over on anyone; they probably figure it's as boring to users as saying that the program uses an installer. Analytics are widely accepted (to the point where I think they're TOO accepted in some ways), so I applaud Cakewalk for bringing this to the attention of the community.
 
(BTW I received a survey from Microsoft today about Windows 10. I of course filled it out, but I'm probably the only person who mentioned "Multi-client MIDI" as one of my favorite OS features in W10. )
 
 
2016/03/07 14:42:10
Anderton
Ryan Munnis [Cakewalk]
"How often do customers experience dropouts?"
 
Honestly, I'd love to know.
 
The answer to this question is always extremely subjective. Support representatives may say "quite often" because they're often on phone calls with customers using integrated sound cards with poorly performing drivers, before those customers have learned how to configure SONAR for use with their new audio hardware. QA might say "occasionally" because they're used to testing and working with beta-testers who own and work with superior hardware but also know beta builds can be unpredictable from time to time. Developers themselves might say "never" because optimizing their system for audio performance is completely second nature. End users will give a different answer every time based on their own subjective experience.
 
The term "dropout" itself is also interpreted a few different ways. Oftentimes customers report simply, "I get tons of dropouts", but what exactly are they referring to? Are they referring to clicking and popping during playback, or are they referring to the audio engine stopping? We've even had users refer to the timeline intentionally stopping at the project end as a "dropout" (yes, this is true). So if an email/call/bug report comes through like that, what do we do with that data? Is it factual to refer to that as a "dropout", or do we make a clear distinction? How do we build an accurate report of whether or not dropouts are plaguing customers? How do we build an accurate report of whether a particular build of SONAR we just released has increased or decreased the number of dropouts end users are experiencing?
 
The problem is that nobody can really give the developers a helpful answer here. It's usually vague, very subjective, and lacking any specifics helpful to troubleshooting and making improvements.
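The distinction Ryan draws (clicks and pops vs. an engine halt vs. an intentional stop at project end) is exactly the kind of thing structured telemetry could capture. A minimal sketch of that idea, using entirely hypothetical event names and cause codes rather than anything Cakewalk has actually announced:

```python
from collections import Counter
from enum import Enum

class DropoutCause(Enum):
    # Hypothetical cause codes an audio engine might attach to each event.
    BUFFER_UNDERRUN = "buffer_underrun"   # clicking/popping during playback
    ENGINE_STOPPED = "engine_stopped"     # audio engine halted unexpectedly
    END_OF_PROJECT = "end_of_project"     # intentional stop, not a fault

def summarize_dropouts(events):
    """Aggregate reported events by cause so builds can be compared
    objectively instead of relying on 'I get tons of dropouts'."""
    return Counter(event["cause"] for event in events)

# Example batch of events, as a build's telemetry might report them.
events = [
    {"cause": DropoutCause.BUFFER_UNDERRUN},
    {"cause": DropoutCause.END_OF_PROJECT},
    {"cause": DropoutCause.BUFFER_UNDERRUN},
]
print(summarize_dropouts(events))
```

With a per-cause tally like this, a release could be benchmarked against the previous one on real counts of underruns and engine stops, while intentional end-of-project stops are filtered out rather than miscounted as dropouts.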

 
Thanks. I think this is the clearest explanation yet on why Cakewalk wants to do this.
 
2016/03/07 15:48:49
Beepster
to the point where I think they're TOO accepted in some ways
 
Yup.
 
If Cake or anyone else gives a crap about what ole Beeps has to say (without flinging poop at him or telling him to essentially STFU) then mayhaps I'll take some time to type something up.
 
In the past couple weeks though I've gone from "ZOMG!! I HAVE TO FIND MONIEZ TO STAY UP TO DATE WITH SONAR" to "Meh... we'll see".
 
I doubt I am alone.
 
I will however continue to help others when I can with the version I have.
2016/03/07 16:06:17
Paul P
brconflict
I'll give credit to CW for at least announcing this, vs. tossing it in without you knowing or being able to opt out. Not everyone is this transparent.



I agree that it was a good idea for Cakewalk to announce this before implementation and thereby save us from a major surprise on release day, as happened last time around.  This way we are forewarned, can make deliberate decisions, and have gotten our opinions out of the way.  Release day should be a quiet affair.
2016/03/07 18:24:22
stevec
+1   It's almost as though... CW is making adjustments, and learning how to better accommodate its users.  At least here on the forum.   
2016/03/07 20:03:40
Brando
Anderton
Ryan Munnis [Cakewalk]
"How often do customers experience dropouts?"
 
Honestly, I'd love to know.
 
The answer to this question is always extremely subjective. Support representatives may say "quite often" because they're often on phone calls with customers using integrated sound cards with poorly performing drivers, before those customers have learned how to configure SONAR for use with their new audio hardware. QA might say "occasionally" because they're used to testing and working with beta-testers who own and work with superior hardware but also know beta builds can be unpredictable from time to time. Developers themselves might say "never" because optimizing their system for audio performance is completely second nature. End users will give a different answer every time based on their own subjective experience.
 
The term "dropout" itself is also interpreted a few different ways. Oftentimes customers report simply, "I get tons of dropouts", but what exactly are they referring to? Are they referring to clicking and popping during playback, or are they referring to the audio engine stopping? We've even had users refer to the timeline intentionally stopping at the project end as a "dropout" (yes, this is true). So if an email/call/bug report comes through like that, what do we do with that data? Is it factual to refer to that as a "dropout", or do we make a clear distinction? How do we build an accurate report of whether or not dropouts are plaguing customers? How do we build an accurate report of whether a particular build of SONAR we just released has increased or decreased the number of dropouts end users are experiencing?
 
The problem is that nobody can really give the developers a helpful answer here. It's usually vague, very subjective, and lacking any specifics helpful to troubleshooting and making improvements.

 
Thanks. I think this is the clearest explanation yet on why Cakewalk wants to do this.
 

+1 - enough for me to soften my earlier position. Makes sense, and there are likely other scenarios where this approach pays off, especially for benchmarking a new release's performance against prior releases. I still would want to test it to prove to myself there is really no overhead or glitching. IMO, though, using analytics to mine for potential workflow improvements is still a mistake, for the reasons already outlined. Just my 2 cents.
2016/03/07 20:36:08
John T
I think I'd say that using only analytics to mine for potential workflow improvements would be a mistake. But that doesn't seem to be what's being proposed by Cakewalk.
2016/03/07 23:32:48
ampfixer
Am I wrong to think that the feedback portal and the analytics will be tied together? Sounds like a good place to opt in or out.